Has anyone noticed?

Dunno.

My 1080 Ti FE has been absolutely rock solid. I get to 2.0GHz with no issues at all. No fiddling with settings or anything. Just fire up Precision X, set the power limit to 120%, add 200MHz to the boost clock and 500MHz to the memory, and away it goes. Temps never get above 40C.

I want to say 35C, but I don't think I've pushed the card hard enough to make that claim. Three 60Hz 1080p monitors only require so much power.

(Running it watercooled with a 480mm and 120mm radiator)
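
For anyone curious what those Precision X sliders work out to in absolute terms, here's a minimal sketch of the arithmetic. It assumes the 1080 Ti FE's 250 W reference board power, a typical out-of-the-box boost clock of roughly 1800 MHz, and that the memory offset applies to the ~5505 MHz figure most tools report; all three are assumptions that vary by card, BIOS, and tool, so treat the output as ballpark numbers rather than a tuning guide.

```python
# Rough arithmetic for the Precision X settings described above.
# Assumed baselines (vary per card/BIOS): 250 W reference board power,
# ~1800 MHz typical stock boost, ~5505 MHz reported memory clock.
REFERENCE_POWER_W = 250
TYPICAL_BOOST_MHZ = 1800
STOCK_MEM_MHZ = 5505

def apply_offsets(power_limit_pct, core_offset_mhz, mem_offset_mhz):
    """Convert slider values into approximate absolute targets."""
    power_w = REFERENCE_POWER_W * power_limit_pct / 100
    core_mhz = TYPICAL_BOOST_MHZ + core_offset_mhz
    mem_gbps = (STOCK_MEM_MHZ + mem_offset_mhz) * 2 / 1000  # effective data rate
    return power_w, core_mhz, mem_gbps

power_w, core_mhz, mem_gbps = apply_offsets(120, 200, 500)
print(f"Power limit : ~{power_w:.0f} W")      # ~300 W
print(f"Core clock  : ~{core_mhz} MHz")       # ~2000 MHz, i.e. the 2.0 GHz above
print(f"Memory      : ~{mem_gbps:.0f} Gbps")  # ~12 Gbps effective
```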

Don't say you dunno... you absolutely know. You bought a 1080 Ti, which proves you're a knowledgeable individual. Evidently you didn't buy into AMD's marketing hype (as I did), so you saved yourself a great deal of disappointment. I'm happy with your card's younger little brother, and since I'm not an intensive gamer it suits me fine. I'm sure that watercooled card will last you years (provided it doesn't spring a leak), and no panic, because year after year the message from AMD has been "yeah, we're the cutting-edge budget people who charge more for our top-of-the-line budget cards than NVIDIA charges for their top-end high-performance cards." Not to worry, it's not like AMD caters to the little guy. AMD caters to the cryptos while their fanboys fight tooth and nail to defend their false god that has never managed to make a top-end card that tops the NVIDIA top end. Alas, in the end all idols fall and I am no longer an AMD fanboy. I'm done with this ideology of idolatry. I'm just happy I finally have a decent graphics card. (Even though I'm $200+ over budget for it.) *Edited because I'm still learning how to spell "evidently"

HA! They finally brought down the price! See? Whistle-blowing helps. :smiley:

I mean yeah, it's top end.

I'm still iffy on it because I'm only running a 5760x1080 60fps setup. I want/need to upgrade to 1440p or 4K to really see what this card can do. The difference between this card and a 970 is big, but still not all that big considering what I have to display on.

(That being said, any game I play is glued to 60fps at maximum settings; the only exception thus far is Metro LL Redux, which drops to around 30fps if I turn everything on. Although that might be me turning the wrong settings on or something.)
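
For a sense of scale on those resolutions, here's a quick bit of arithmetic (nothing card-specific, just pixel counts): triple 1080p surround already pushes more pixels per frame than a single 1440p screen, though 4K is still a step up.

```python
# Pixels per frame and per second (at 60 Hz) for the setups mentioned above.
resolutions = {
    "3x 1080p (5760x1080)": (5760, 1080),
    "1440p (2560x1440)": (2560, 1440),
    "4K (3840x2160)": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:22s} {pixels / 1e6:4.1f} MP/frame, {pixels * 60 / 1e6:4.0f} MP/s at 60 Hz")
# 3x 1080p ~6.2 MP, 1440p ~3.7 MP, 4K ~8.3 MP per frame
```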

Also @KuramaKitsune is selling some used 1080s at a decent price.

I wouldn't worry about it; the human brain can't even distinguish the difference between 40 and 60 fps. The real thing you'll notice is throttling and tearing, which you'll probably agree is virtually non-existent with the kind of GPU you have in that graphics card. I'm happy with my standard ASUS 27" monitor for the time being, but later I might go 4K as the prices on those monitors continue to drop. I definitely have the card to handle it, albeit maximum settings for gaming could be tricky in some cases with an EVGA GTX 1070 FTW. Still, I have to say I have AMD to thank for helping me get a real card. I would never have shelled out the extra 250 bucks to NVIDIA if it weren't for their negligence ::::GRINS:::: WOW... What a difference between a budget card and a top end, eh? :wink:

So much yes.. Especially to all the ridiculous AMD fanboys on this forum holy crap lol

Yes, it can.


I'm talking about going from 60fps to 144fps.

Goblin, it is possible to train your brain to see these distinctions, but it is hardly average. The real concern is throttling and tearing, as I stated in the full quotation. Howbeit, I sit corrected: when trained, the human brain can notice these distinctions, as gamers represent “a really weird population of people who are probably operating close to maximal levels [of vision],” according to assistant professor Jordan DeLong, as stated in this article:

Myself, I probably don’t need anything over 60 FPS anyway, as I am more interested in video rendering than in hard-core gaming; but to each their own. I’m very pleased AMD drove me away with their negligence and generated the sort of economic climate in the world of graphics cards that (for all intents and purposes) FORCED me to get a top-end card as opposed to something newer but arguably inferior to an EVGA GTX 1070 FTW. I couldn’t care less now if AMD sold all their latest releases to the Chinese crypto crowd before any retailers in North America even saw them. It wouldn’t be the first time that has happened. GO AMD!!! lol lol lol

Okay.
But I’m going to guess you’ve also probably never sat down with a 144Hz monitor for long periods, and thus don’t know what that’s like.

It’s not about some superhuman ability to see anything. It’s about what looks and feels better. The argument you’ve trotted out is one frequently made by people who have played on consoles or GPUs that output at or below 60 fps and who, never having experienced anything else, can’t possibly imagine what all the fuss is about.

The same is true for audio. Why bother with multi-hundred-dollar headphones and an amp when earbuds still let you hear your music? Why bother with MP3s at 320 kbps when 128 is “just as good”? Why bother buying vinyl when a CD is cheaper?

The problem isn’t that some people prefer 30fps “for artistic reasons”, it’s being told that what you prefer is silly and not real. Because no, it’s not silly. It’s not a fantasy we’ve concocted to lord over people. No, if you think 60 fps is enough, viewing a game at 120/144 fps isn’t going to blow your skull apart with blinding awesome. But that doesn’t mean it’s fake.
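
To put rough numbers on that, the gap people describe tracks frame time rather than the fps figure itself; here is a quick sketch of the arithmetic (just 1000/fps, no perception science implied):

```python
# Frame time in milliseconds for the rates discussed in this thread.
for fps in (30, 40, 60, 90, 120, 144):
    print(f"{fps:3d} fps -> {1000 / fps:5.2f} ms per frame")

# 40 -> 60 fps shaves ~8.3 ms off every frame (25.0 -> 16.7 ms);
# 60 -> 144 fps shaves another ~9.7 ms (16.7 -> 6.94 ms), before any
# latency the rest of the display pipeline adds on top.
```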

As I stated earlier, I consider the difference between 40 and 60 fps negligible. Your example points to the extreme, and 60 fps is hardly extreme. I started this thread looking for some input on a good, reliable budget card. I have my reservations as to whether I could distinguish the difference, as I have made no effort to train myself to do so. That said, given the context in which I made my statement, I wouldn’t consider my communication to be entirely inaccurate, and I wonder how many have gone through the effort to train their brains to visually differentiate between the two frame rates.

I certainly wouldn’t besmirch anyone’s artistic palette, as I respect art and the artist. Even black and white photographs have a place in the realm of art. Certainly there is no comparison between B&W 801 Matrix speakers, for example, and a set of Beats headphones. (Give me the 801s any day of the week and a good Nakamichi on a high-end McIntosh amp and I’ll be happy.) But comparing the audio to the visual is tantamount to comparing apples and oranges, IMHO.

Howbeit, these are extremes, and the high-performance users who have the coin to pay for the extra 5% (or less) improvement over what they already own represent the exception. Clearly the exception only proves the rule. As a rule, this hype about frame rates is negligible, especially when one is in the market for a budget graphics card, as I was. Yet, to be completely fair, I must concede that the GTX 1070 (or the 1080) is not exactly regarded as a “budget” card. I am confident that it will be soon enough, so I will chalk up my purchase to a little “future proofing”. :sunglasses:

I am a photographer; I know exactly what difference a high-quality, color-accurate monitor can make. I’m also a gamer, and for a long time I was gaming on 60Hz. Today I am running a 144Hz TN panel for games. To say the eye can only see 40Hz is wrong. Same goes for 60. There is a reason why VR headsets aim for 90Hz and above in any fast-moving game.

Is there a big difference between 60 and 90/100? Yes! Absolutely. It feels completely different and is less stress on the eyes.

So as a photographer and a long-term gamer, would you say that your eyes and brain are trained to notice the distinction between 40 and 60 fps?

" Gamers… [are] a really weird population of people who are probably operating close to maximal levels [of vision]. "
Assistant professor Jordan DeLong

Definitely this. I can tell the difference between 24, 30, and 60 fps, but I have preferences. When it comes to film, I prefer 24 fps. When you make a film in 30 fps, it just looks and feels too real. Everything is too smooth. I can’t imagine what a 60 fps film looks like.

I remember I was in Best Buy a long time ago when Blu-ray first came out, and one of the TVs on display in the front was running Pirates of the Caribbean. It felt like I was on set! That was way too real for me for a movie. I’m not sure if it was Blu-ray or whatever technology the TV was using, but I found it incredibly unsettling.

I’ve also seen stuff in 60 fps and it is super smooth. I think it’s good for gaming and stuff where you want to feel like you’re actually in the scene, such as nature documentaries and sporting events. In the future, frame rate is going to be an artistic choice, just as black and white photography is today in film or visual styles are in animation.

Anyone will spot that a game feels and looks more fluid. Yes, of course you can see that difference.
Here, leave it in windowed mode, watch at 720p60, then pull it down to 480p and back.

I think this article I posted earlier addresses it best.

What we really know

After all of that, what do we really know? That the brain is complicated, and that there’s truly no universal answer that applies to everyone.

- Some people can perceive the flicker in a 50 or 60 Hz light source. Higher refresh rates reduce perceptible flicker.
- We detect motion better at the periphery of our vision.
- The way we perceive the flash of an image is different than how we perceive constant motion.
- Gamers are more likely to have some of the most sensitive, trained eyes when it comes to perceiving changes in imagery.
- Just because we can perceive the difference between framerates doesn't necessarily mean that perception impacts our reaction time.

So it’s not a tidy subject, and on top of all of this, we have to also consider whether our monitors are actually capable of outputting images at these high framerates. Many don’t go above 60 Hz, and Busey questions whether monitors advertised at 120 Hz really display that fast (according to some seriously in-depth testing at TFTCentral, they certainly do). And as someone who has also enjoyed games at the 30 frames per second (and often rather less) rendered by my consoles, I can relate to them suggesting that other aspects of visual displays might connect better with my visual perception.

On the other hand, I would love to hear from pro teams about their objective experiences with framerate and how it affects player performance. Perhaps they’ll corroborate or contradict science’s current thinking in this field. If gamers are so special when it comes to vision, perhaps we should be the ones to spearhead a new understanding of it.

Can you see a difference in the example I posted above?

I never really bothered, actually. I would likely notice some difference with my “trained eyes” and photographic experience, but I like the settings on my card and this OS and don’t want to mess with them. I’ll trust what I read in the article. Based on past experience, I think the article is accurate.

So if the experiment does not deliver the result you want, just ignore it?

Confirmation bias in a nutshell.

You’re assuming I want a particular result. I don’t. I still think 60 fps is good enough for me and I have a graphics card that will deliver considerably more than that. :slight_smile: