Have you tried using a different program to monitor GPU usage? Try MSI Afterburner or something to confirm whether EVGA Precision X is displaying the correct numbers. Even if this isn't the problem, it'll at least knock out one variable.
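If you'd rather script the cross-check than install another overlay, you can also ask the driver directly (this assumes an NVIDIA card and that the driver's `nvidia-smi` command-line tool is on your PATH; a rough Python sketch, not a guaranteed recipe):

```python
import subprocess

def gpu_utilization_percent():
    """Query GPU utilization straight from the NVIDIA driver via nvidia-smi.

    Returns the utilization as an integer percentage, or None if the
    query fails (no NVIDIA driver, tool not on PATH, etc.).
    """
    try:
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            text=True,
        )
    except (OSError, subprocess.CalledProcessError):
        return None
    return parse_utilization(out.splitlines()[0])

def parse_utilization(csv_line):
    """Parse one line of 'csv,noheader,nounits' output, e.g. '40'."""
    return int(csv_line.strip())

util = gpu_utilization_percent()
if util is not None:
    print(f"Driver-reported GPU utilization: {util}%")
```

If the driver's own number agrees with Precision X, the 40% reading is probably real and not a monitoring glitch.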
If you have a decent aftermarket CPU cooler, you could also try overclocking your CPU to see if you gain any FPS or GPU usage, though I'm not sure that's possible with your particular mobo.
Look, dude, everyone is saying that your CPU is crap and that's why your video card can only be utilized at 40% capacity. You're running 667 MHz memory; compare that to the effective 5000 MHz memory on your video card, and to the top CPU memory speeds now of 3000+ MHz. How does that not make sense? Should I show you a picture of an hourglass? Your processor is almost six years old, for Christ's sake. Stop looking for any reason other than the reality.
OK, well, tell me this then: at the character selection screen, or while loading into a BF3 game, or on any loading screen, GPU usage goes up, my frame rate hits 800+, and the temps start climbing. But when I get into ANY game (and read that clearly: ANY game, regardless of what I'm doing) it sits at 40%. How could my CPU be the problem if it lets usage go up during loading, but then it drops back to exactly 39-40% in-game? It's always one of those two numbers, 39 or 40. And yeah, I've got Vista 32-bit, but that's only a memory limitation, "DUDE," lol. All you have to say is that my hardware is shitty, which I understand. But before I go spend 600 bucks on new hardware, I'd like to know this thing is going to pump out more than 40% in-game.
If you could give me some kind of math as to why 2.4 GHz only gets you 40% out of a 1054 MHz GPU, then I'd be happy with what I've read so far. But all you have for me is "you have a CPU bottleneck," when there are tons of people out there with CPUs running at 4 GHz or higher who are having the same problem.
And by the way, RAM speed doesn't really matter in games, and going from Vista 32-bit to Windows 64-bit would only give me MAYBE a 5% increase, from what I have read. Unless you can tell me there's a reason that, "because it's 32-bit," it's only allowing a certain amount of power to pump through the card.

Which I don't think is the case, since my card is running over 3000 MHz memory all the time and displaying over 1050 MHz all the time, but with 40% usage.

The core clock never moves, as a result of the high-performance option I have enabled in games, but the memory clock does fluctuate while staying over 3000 MHz, and this is with no overclock.
When a PC is on a loading screen, it has very little to do; after all, it's just an image with maybe a few moving elements.

When you're in a game, though, the amount the CPU is being asked to do is far, far greater, so work starts to queue up and everything takes longer: more data means more queues, etc.
It does seem that your CPU is the bottleneck, going by the information you've provided. Beyond making sure you've turned vsync OFF globally and in-game, along with all the adaptive settings globally, there really isn't much more to be done.

You can always download FurMark (http://www.ozone3d.net/benchmarks/fur/), run it, and see what your GPU does. If it sticks at 40% load in that too, then perhaps it's a software issue or something, but if it runs above 40%, your CPU is the bottleneck.
I'll give it a go at explaining it to you. You're looking at the raw numbers, when unfortunately games don't work like that. (I'm sure people will correct me if I'm wrong, but this is how I understand it currently.)

Right: your CPU has 4 cores (threads). Most games out at the moment, except really new titles, generally use 2 threads max, which is why CPU usage rarely goes above 60% in a game (allowing some percentage for Windows functions).

What this means is that your CPU will only ever use 2 cores to process information for the game you're playing, so your CPU is instantly handicapped. On top of this, your CPU does its calculations mostly in sequence, i.e. one calculation first, followed by another and another (this is how games are usually coded to work). That method works fine for simple tasks such as web pages and documents, but games require millions of calculations in quick succession, which puts those 2 threads (cores) under huge pressure to get them all done fast enough.

More modern CPUs have developed fancy ways to handle this: some have more powerful cores, and some have virtual cores that kick in when needed. Your older CPU, however, is stuck with plain cores and old techniques for processing info, so when its cores are running at max, that's it; that's when you get a bottleneck.
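To put rough numbers on the thread-count point (the "2 threads" figure is the assumption from above, not something measured on your machine), a quick Python sketch:

```python
def max_total_cpu_usage(game_threads, physical_cores):
    """Upper bound on the total CPU usage reading when a game can only
    keep `game_threads` cores busy on a `physical_cores`-core CPU.
    Each fully loaded core contributes 1/physical_cores of the total."""
    busy = min(game_threads, physical_cores)
    return 100.0 * busy / physical_cores

# A game using 2 threads on a 4-core CPU can never push the total
# usage reading past 50%, even while those 2 cores are maxed out.
print(max_total_cpu_usage(2, 4))   # 50.0
print(max_total_cpu_usage(2, 2))   # 100.0
```

So a "low" total usage number and a hard CPU bottleneck can coexist: the cores the game actually uses are saturated while the others sit idle.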
The way you're looking at the data doesn't really fit how it works. It's not about the clock numbers, it's about the volume. GPUs handle their data differently from CPUs: they work in a very parallel way rather than one calculation after another. They act more like torrent programs; they do whatever they can, whenever they can, then package it all up and send the finished job off.

Yeah, I really don't know what I'm saying now, I've lost myself :D. I'm sure someone will correct my errors, but the simple fact is your GPU is a 20+ lane superhighway while your CPU is a two-lane, one-speed, one-direction B road.
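If a code picture helps, here's a toy Python sketch of the contrast being described (the functions and numbers are made up for illustration): game-style logic where each step depends on the previous one has to run in order on one thread, while GPU-style work on independent items can be handed out to many workers at once and collected afterwards.

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_sequential(state, steps):
    """Game-logic style: each update depends on the previous result,
    so it must run in sequence no matter how many cores exist."""
    for _ in range(steps):
        state = (state * 3 + 1) % 1000  # toy "next frame depends on last"
    return state

def shade_parallel(pixels, workers=8):
    """GPU style: every pixel is independent, so the work can be
    farmed out to many workers at once and gathered at the end."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda p: p * 2, pixels))

print(simulate_sequential(7, 5))     # one chain of dependent steps
print(shade_parallel([1, 2, 3, 4]))  # [2, 4, 6, 8]
```

The dependent chain is why throwing more cores at old game code doesn't help, while the independent per-pixel work is exactly what a GPU's hundreds of lanes are built for.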
I think the 760 is a good choice, but you might want to upgrade your monitor to 1080p. If you're only gaming, grab an i5 instead of the i7. That should free up some budget.
Yeah, I did a bit of research, and the only reason I would go i5 is because I heard that hyper-threading actually causes performance drops in some games. It would be a shame if, three years from now, it became a must-have, lol.

And I totally agree on the monitor; I really would like to game at 1080p.
Better to get the i5. If hyper-threading does turn out to be useful for gaming in three years, you'd probably be looking to upgrade around then anyway, replacing the i5. I think an i5 would last closer to five years, personally.
What CPU did you have at the time, lol? 'Cause like I said, adaptive mode is off in my settings; I have it set to high performance in all my games, but I do keep it on adaptive globally so it doesn't run that hard when I'm not doing anything taxing. In monitoring, though, it shows 1054 MHz but with 40% GPU usage, so maybe something is wrong with my monitoring program (EVGA Precision). Gonna try out some different ones and see if I get the same results.

For all I know, the clock speed might not have anything to do with the usage, or the reading is broken; one or the other. The core clock stays at exactly 1054 MHz all the time, and the memory clock will hover around the 3000 MHz mark, never under, but sometimes a bit more.

Also, I noticed that when I try to push the core clock higher, I get no results in my monitoring unless I have adaptive mode on, in which case (say it was using 500 MHz and I added 30 to it, it would show 530) you can see the extra 30 most of the time. I haven't messed with the core clock since I found the "prefer high performance" option in the NVIDIA settings, 'cause I figure it's already pushing high enough MHz and I don't want to hurt it.

One thing I do know for sure, though: I can't get the update that fixed the "test" function in EVGA Precision, because I have a 32-bit OS and you need 64-bit for that. It's the added benchmark that comes with the program.
I completely agree; I can't see how the CPU could possibly bottleneck your GPU to 40% in every game you play. As for your CPU being utter crap: based on PCMark it's almost as good as my i5 750 (although mine is not at stock frequency), and I find it hard to believe you can't get more than 25 FPS at sub-1920x1080 in BF3. I'm assuming it's some type of driver issue...
Yeah, but see, there are i5 CPUs with like 3.4 GHz stock speeds that have 4 cores / 4 threads, and mine does as well. And then there's MHz or something, I dunno. The newest CPUs out there really are far greater than mine, I get that. But honestly, I keep hearing all this about how overclocking would help.

And if what you say is true (and I understand what you mean), that would mean that going up a GHz wouldn't help me in this case at all, 'cause I'm only getting around 50% usage from the CPU. So making it go to 3.0 GHz instead of 2.4 would maybe bring that down to 45-30% CPU usage?

And then, still, I would get no performance boost because I have a 4-threaded CPU? Lol, what do you think all the people who are rocking the old i5s and i7s are doing, with their 4-core / 4-thread CPUs?

What I'm trying to get out of someone is more than "ARGH, it's complicated... your CPU is old."

I need something like "THIS CPU > ???? would allow you to get the extra usage because it has ????, while your CPU does not."

'Cause I know it's not the 2.4 GHz, when my CPU usage is at 50% or less in every game I play. And again, it's 4 cores / 4 threads.
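One thing worth noting about that 50% reading (a sketch, with made-up per-core numbers): the overall "CPU usage" figure is an average across all cores, so a game maxing out 2 of 4 cores shows up as roughly 50% even though the threads it actually runs on have no headroom left.

```python
def total_usage(per_core):
    """The Task Manager style 'CPU usage' number is just the mean
    of the per-core utilization figures."""
    return sum(per_core) / len(per_core)

# Two cores pegged by the game, two nearly idle (illustrative values):
cores = [100, 95, 5, 3]
print(total_usage(cores))  # 50.75 -- reads as "only ~50%",
# yet the cores the game actually uses are completely saturated.
```

So "my CPU usage is at 50% or less" is exactly what a 2-thread-limited game pegging 2 of 4 cores would look like, which is why people keep calling it a bottleneck despite the number looking low.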
YES, I have a 32-bit OS.

And YES, I have 3 gigs of RAM BECAUSE of the 32-bit OS.

And YES, I have 667-speed DDR2 RAM, because my system is a piece of shit with a crappy mobo that can't handle anything faster.

But the RAM speed isn't what's causing it; it just can't be. If you have something to prove to my complete noob ass about RAM and gaming, go for it, teach me. 'Cause in the two weeks I've been reading forums, all I see is people getting pissed at even the suggestion that RAM could be the problem.

The OS I have is 32-bit, and I might see a boost of 5% with 64-bit, outside of the RAM increase.
Now, the fact that I have a 5-year-old mobo: if someone told me that the motherboard in a Dell XPS 420 can't push a 760 past 40% because it's a bad chipset, and had proof, that would be fantastic. But I have no way of knowing.

No, I get great FPS most of the time in BF3, even with 40% GPU usage, 'cause my CPU eats it alive, even at 2.4 GHz. That's what I'm trying to tell these kids, lol.

Now, World of Warcraft, on the other hand, in 25-man heroics... that is (I'm guessing) a CPU hog, 'cause I will go down to 10 FPS in that. And I took down all my addons, deleted all the folders, all that. Trust me, I've done it all. I get like 20-30 FPS in 10-man raids.

And sadly this is with ANY settings in-game, low or high: same GPU usage. It's freaking crazy. I think not having 4 gigs of RAM is probably hurting me, but let's get real, there's got to be a more serious problem.

And I don't think it's the driver, 'cause I've tried the new one twice, plus the beta driver and the old driver (which I'm currently using), and nothing changes.
Just saying: if it is the CPU, couldn't you overclock it a little and see if you get a performance bump? Then you could tell definitively that it's the CPU.
Old as in the one before these new beta drivers went out. The beta driver couldn't even detect my monitor the first time I got it, so I had to turn off my computer and restart just to get back to Windows. Somehow it installed; I played with it for a while, and after seeing no boost in performance I went back to the old ones again.
Nope, OC is not an option, as I have a Dell XPS 420 from a while back. The cooler on that thing sucks, and I can't use an aftermarket one 'cause the motherboard is proprietary.
Planning on upgrading soon, though. I just want to know if my card is defective, so I can go get something that works until I DO upgrade. And everyone insists that it isn't my GPU, lol.