Does anyone know how much G-sync monitors will be?

Just as the subject says, does anyone know?

Now that the 280Xs are $400, I'm looking at the GTX 770s. G-sync would be really cool to have.

Probably $300-$400 for starting models, but I heard you may be able to buy a kit and upgrade your current monitor with G-sync.

Yeah, I thought that was what G-Sync was rather than actual monitors. I thought it was just a little add-on for your screen.

They demonstrated it on some 144 Hz ASUS monitor that was jury-rigged with a G-Sync card/module/thingy. It was a really rough product then in how it was fitted... But there is a plug-in module coming that you put inside your monitor to support it. I don't know much about monitor building, so how that'll work I don't exactly know... They also said there will be monitors from major manufacturers (ASUS, BenQ, etc...) with the card built in from the factory. Because of how Nvidia is doing it, there probably won't be a cheap ready-made small-brand monitor with it built in. So probably no X-Star G-Sync ready-made monitors.

G-sync is a rip-off. Most modern monitors already have a variable "response time"; even very cheap monitors have such an option in their menu, usually with two presets: "normal", which corresponds to 30 fps, and "gaming" or "fast", which corresponds to 60 Hz/60 fps. The only thing that needs to be done to turn one into a "G-sync" monitor is a small script that switches the VESA code from 60 Hz to 30 Hz as soon as your game's fps drops below 60, plus setting the adapter to refresh only after finishing each frame. These are bog-standard Linux functions, completely free, and included in many open source games and emulators on Linux, which is why those games look so good. I wouldn't spend a penny on G-sync; it's just another nVidia rip-off.
Screen tearing is a bug. It's something graphics card manufacturers don't solve because they want to benchmark better, but the fact is that the human eye can't see more than 30 fps, and 60 fps evens out even the most extreme motion blur. If a game can be rendered with 60 finished frames per second, it's as good as it gets, but how many Windows games can actually do that? The fact that nVidia brings out G-Sync now is just a testament to the fact that they have been benchmarking unfinished frames instead of making sure users get 30 to 60 finished frames per second. It's ludicrous.
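For the curious, here's a rough sketch of what that kind of script could look like. It assumes an X11 desktop with the cvt and xrandr tools; "HDMI-1" is just a made-up output name, and the part that watches your game's fps is left out:

    # Rough sketch only: generate a CVT modeline ("VESA code") for the wanted
    # refresh rate and switch the output to it.  Assumes X11 with cvt/xrandr;
    # OUTPUT is hypothetical -- run `xrandr` to find your own connector name.
    import subprocess

    OUTPUT = "HDMI-1"

    def set_refresh(width, height, hz):
        # cvt prints a comment line followed by:
        # Modeline "1920x1080_30.00" <pixel clock> <timings...> <sync flags>
        out = subprocess.run(["cvt", str(width), str(height), str(hz)],
                             capture_output=True, text=True, check=True).stdout
        tokens = next(l for l in out.splitlines() if l.startswith("Modeline")).split()
        name, timings = tokens[1].strip('"'), tokens[2:]

        subprocess.run(["xrandr", "--newmode", name, *timings])  # may already exist; fine
        subprocess.run(["xrandr", "--addmode", OUTPUT, name])
        subprocess.run(["xrandr", "--output", OUTPUT, "--mode", name], check=True)

    # e.g. drop to 30 Hz while the game can't hold 60 fps, back to 60 Hz afterwards
    set_refresh(1920, 1080, 30)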

I was playing Xonotic earlier with some friends. The fps is way over 250 on ultra in Linux with a modern GPU (I get over 500 fps with a 7850, everything on ultra at 1080p). A friend was complaining about screen tearing on his laptop with Intel HD graphics, but after enabling "wait for GPU to finish each frame", the GPU was still spitting out over 130 fps into a 60 Hz TN screen and there was absolutely no screen tearing, because each frame was entirely rendered. After setting everything to ultra, the Intel HD graphics couldn't render 60 fps and some motion stuttering appeared, so we set the VESA code to 30 Hz, which evenly spread the double frames; the motion stuttering was gone and everything looked perfectly smooth. G-sync is a reversed solution that makes no sense: it solves a problem for nVidia, not for the user.

Any cheap monitor can be pumped up very easily to about 75 Hz, but even that isn't worth doing, as the human eye can't even see the difference between 30 and 60 fps. What you think you see when you compare the two is not the performance of the monitor: at 30 Hz, some of the screen tearing caused by the GPU not finishing frames (to benchmark better) is still visible on the screen, whereas at 60 Hz you don't even see it any more. If you tell the application not to issue a call until the GPU is done rendering each frame, all problems are automatically solved. You won't benchmark as fast, but it'll look better.
If you can't reach 60 fps, which makes 60 Hz too high for perfect rendering on your monitor, you can tell the monitor to follow the fps of the GPU, which is what G-sync does, but that will not cure screen tearing; it will only make the movements more fluid because you won't have double frames on your monitor. A better solution is to tell the GPU to spit out refreshes at 30 Hz and change the VESA code, because that also solves the motion fluidity problem while allowing the GPU to finish each frame, so there is no more screen tearing. It also greatly reduces the stress on the hardware, especially on the monitor.
To be honest, I run my monitors at 30 Hz all of the time; it keeps everything cool and smooth, and I let the application wait for each frame to fully render, to prevent screen tearing. Thing is, that's a piece of cake to set up with Intel and AMD GPUs, but for some reason it doesn't quite work on nVidia GPUs with the proprietary driver (whereas with the nouveau driver it looks normal!). I even get screen tearing when closing windows on the Linux desktop, because the fucking nVidia crap driver can't interpret standard formatted OpenGL calls. I find it despicable that nVidia once again is making users pay extra to hide their own shortcomings.
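If you want to see what "let the application wait for each frame to fully render" amounts to in practice, here's a minimal sketch of my own, assuming pygame 2.x with PyOpenGL installed; the vsync flag in set_mode() is only a request that the driver may refuse:

    # Minimal sketch: request vsync'd buffer swaps, then block on glFinish() so
    # a new frame is never issued before the previous one has fully rendered.
    import pygame
    from OpenGL.GL import glClearColor, glClear, glFinish, GL_COLOR_BUFFER_BIT

    pygame.init()
    pygame.display.set_mode((1280, 720), pygame.OPENGL | pygame.DOUBLEBUF, vsync=1)

    clock = pygame.time.Clock()
    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False

        glClearColor(0.1, 0.1, 0.1, 1.0)
        glClear(GL_COLOR_BUFFER_BIT)
        # ... draw the scene here ...

        pygame.display.flip()   # swap on the monitor's refresh (vsync)
        glFinish()              # stall until the GPU has finished the frame

        clock.tick()            # only used to read back the resulting frame rate
        pygame.display.set_caption(f"{clock.get_fps():.0f} fps")

    pygame.quit()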

Read next time, please.

Okay, seriously, I'm tired of the stupid idea that the human eye can't see over 30 fps.  Have you EVER played a game at 30 fps and then again at 60 fps?!  There's a HUGE difference: 30 frames is pretty choppy, while 60 frames is very smooth.  Push that up to 75, 90, or even 120 fps (yes, you can see a difference) and it gets even smoother.  The thing is, the human eye does not see in frames per second; it is constantly taking in information, and what matters is simply how fast your brain can translate that information.

Now, I know what you're probably going to say: "but movies and TV shows are only shot at 24 fps!"  Yes, we know, but the reason they look smooth is that the film, and more importantly the human eye, uses a motion-blurring effect to smooth things out.  I'm really tired of people spreading that lie about only being able to see 30 fps.  Play a game at 30 fps, then again at 60 fps.  You will see the difference.  Also, do your research: you'll find that the predicted maximum frame rate the eye can detect/translate (well, your brain does the translating) is somewhere above 200.

Screen tearing is not a GPU bug; it's a problem caused by the fact that monitors and TVs refresh at a fixed rate, whereas the GPU renders at a variable rate.  Monitors were designed to refresh at a fixed rate a long time ago, before 3D graphics were a big thing, because back then there was no need for variable refresh rates.  Once powerful 3D applications came along and powerful graphics processors followed, application frame rates became variable, but monitor manufacturers did not make monitor refresh rates variable as well.  Your GPU delivers frames at a varying rate while your monitor displays them at a fixed rate, and that mismatch is what causes screen tearing and laggy gameplay.

http://www.tomshardware.com/reviews/g-sync-v-sync-monitor,3699.html
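To make that fixed-rate vs. variable-rate mismatch concrete, here's a tiny toy model (my own sketch, not anything from that article): a monitor scanning out at a fixed 60 Hz while the GPU finishes frames at random times between 45 and 90 fps.

    # Toy model: fixed 60 Hz scanout vs. variable GPU frame delivery.  Without
    # vsync the swap happens the instant a frame is ready, which almost always
    # falls partway through a scanout, so the top and bottom of the screen come
    # from different frames (a tear).  With blocking vsync the swap waits for
    # the next refresh boundary: no tears, but every frame that took longer
    # than 1/60 s forces the monitor to show the previous frame twice (judder).
    import math
    import random

    REFRESH = 1.0 / 60
    random.seed(0)
    frame_times = [random.uniform(1 / 90, 1 / 45) for _ in range(600)]

    # No vsync: count swaps that land mid-scanout (not exactly on a boundary).
    t, tears = 0.0, 0
    for ft in frame_times:
        t += ft
        if t % REFRESH > 1e-9:
            tears += 1

    # Blocking vsync: a frame taking ft seconds occupies ceil(ft / REFRESH)
    # refresh slots; every slot beyond the first re-shows the old frame.
    repeats = sum(math.ceil(ft / REFRESH) - 1 for ft in frame_times)

    print(f"no vsync: {tears} of {len(frame_times)} swaps can tear")
    print(f"vsync:    0 tears, but {repeats} repeated refreshes (judder)")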

Lolz mate, the "choppiness" at 30 fps is because of unevenly spread double frames over the lower refresh rate, not because the human eye can actually see more than 25 fps. The "stuttering" and "tearing" above 30 fps but below 60/75 fps are caused by unfinished frames. There is no way anyone can see more than 30 fps; in fact, G-sync will cause more artefacts than just fine-tuning your graphics to 30 fps without irregular double frames (that is, adapting the VESA code) and without screen tearing (that is, telling your application to wait until the GPU has rendered each complete frame). G-sync is the world on its head: it's selling people even more stuff they don't need, to hide the artefacts caused by selling them stuff they didn't need in the first place. In other words, it's selling extra hardware to hide bugs they won't solve because solving them would hurt their benchmarks, benchmarks that are already absurd because nobody could see the difference anyway if those bugs weren't there... so now you know why the bugs are there. It's the basis of all commercial soft- and hardware: include bugs so that you can keep convincing people to buy new stuff.

Do the test for yourself: watch content with your VESA code calculated for 60 Hz, then for 30 Hz, and you will not be able to see any difference at all. Even if you calculate it for 25 Hz, you still won't see any difference. The 60 Hz standard has nothing to do with the graphics experience. The problem is that if you film something (like movies do), the shutter speed of the camera limits the number of frames you can capture, and even at the movie-standard 24 fps, the shutter speed is about 1/30 of a second, which means quite a lot of motion blur is registered on the individual frames. With rendered content there is no motion blur on the frames at all, because there is no shutter, so it always looks artificial: the viewer's eyes see motion like a cinema camera does, and the brain then filters out the motion blur.
That's why games look so good on CRTs: not because of latency as such, but because a CRT has persistence, which makes the unnatural rendered frames look better by introducing blur that the viewer's brain then filters out again. CRTs show far fewer pictures per second than flat panels, yet they are still preferred for the gaming experience. The reason 60 Hz became the standard refresh frequency is also a CRT heritage: a CRT screen is filled from beginning to end, not all pixels at the same time, so to suppress the "wiping" artefact it was decided to refresh the screen at twice the maximum rate the human eye can discern, so that the beam illuminating the individual pixels can sweep front to back and back again within every refresh of the human eye, and the screen looks evenly illuminated. The first LCDs were 75 Hz or higher, but later the industry decided to bring that down to 60 Hz. Just stick to scientific facts, please: nVidia does not sell cybernetic augmentations, it just wants the consumer's hard-earned cash like any other corporation... and it certainly doesn't change the way the human eye works.
Simple solution to avoid artefacts in games: use a single-channel DVI cable, set the mode with the VESA code calculated for 30 Hz (the little arithmetic sketch at the end of this post shows what that calculation boils down to), set the monitor to normal response time, and set the application to wait for each frame to finish rendering... voilà, no more artefacts, no more motion stuttering. Everything still looks artificial because there is no per-frame motion blur, but it looks smooth, better than G-sync. Also, don't buy overly expensive graphics cards; they will only make artefacts worse. Buy a graphics card that runs all your games between 30 and 40 fps, that's the sweet spot. In Windows there are a lot of freezes, so graphics card manufacturers keep selling more powerful cards by telling customers they need more than 60 fps, whereas the only thing they need is to get rid of the freezes that from time to time bring the fps down below 24. Sadly, you'll always have those; they are the consequence of Windows, the ancient and handicapped filesystem it uses, the utter lack of inherent performance, the malware that runs behind your back and bogs the system down, etc... but hey, it works, and everyone spends insane amounts on insane graphics cards. When you upgrade your graphics card and think you have a much better viewing experience, it's not the fps ceiling of the card that makes the difference, it's the fps bottom.
That fps bottom is often artificially manipulated by the closed-source drivers for those cards: once you use the same cards with open-source drivers, the fps ceiling is lower but the fps bottom is higher, and the gaming experience is often better than with the secret-code driver. Don't believe marketing from sponsored or blind sites, believe science!
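And as promised above, the arithmetic behind "calculating the VESA code" for a given refresh rate. The 60 Hz numbers below are the standard CVT totals for 1920x1080; the 30 Hz line simply halves the pixel clock as an illustration, and a real cvt-generated 30 Hz modeline will use slightly different timings:

    # A modeline's vertical refresh is just pixel clock / (total horizontal
    # pixels * total vertical lines), where "total" includes blanking.
    def refresh_hz(pixel_clock_mhz, htotal, vtotal):
        return pixel_clock_mhz * 1e6 / (htotal * vtotal)

    # Standard CVT 1920x1080@60: 173.00 MHz clock, 2576 x 1120 total pixels.
    print(refresh_hz(173.00, 2576, 1120))   # ~59.96 Hz
    # Same timings with half the clock gives the 30 Hz mode I keep talking about.
    print(refresh_hz(86.50, 2576, 1120))    # ~29.98 Hz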