How are clock speeds read?

I'm curious about this because every GPU overclocking tool gives me different speeds for my GPU's clocks, and I don't know which to go by. For example, I run Open Hardware Monitor 24/7 on my fourth monitor so I can always see my speeds, temps, and so on, and I notice it might say my 760s run at, for instance, 1254 MHz for both. Then Gigabyte's V-Tuner says they run at 1000 MHz, MSI Afterburner is more like 12xx, and even Unigine Valley's benchmark simply shows 1202 for my clocks. I've been tweaking this for a while, and I need to know which one I can trust for my actual clocks. For example, here is a screenshot with Unigine, Gigabyte, and OHM showing different clocks on the same GPUs.

Personally, I'd use MSI Afterburner or GPU-Z. A lot of GPU monitoring software, such as EVGA Precision X, HIS iTurbo, and MSI Afterburner (and I suspect Gigabyte's V-Tuner as well, since there's no need to reinvent the wheel), is built on RivaTuner, a tool developed specifically for tweaking Nvidia GPUs since 1997. GPU clocks these days aren't static, so unless all of those tools are giving you different readings at the exact same instant, they may all be correct. The card has a base speed, but it can and does dynamically change the clock depending on a variety of factors: voltage, power consumption, heat output, load, and so on. Especially with aftermarket/third-party cooling, the GPU may run in a boosted state for extended periods. My 760 ACX Superclocked ran at 1202 MHz almost constantly in all but the most demanding games, and it would fluctuate a lot in the 1100 MHz range. When there's little to no demand, like when you're playing an old game, it may drop to 640 MHz or thereabouts; I've seen that while playing older games.
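If it helps to picture it, here's a toy sketch of how a boosted clock might be derived from limits. All the thresholds, the idle clock, and the scaling below are invented for illustration; the real boost algorithm lives in the card's firmware and is far more complex:

```python
def effective_clock(base_mhz, max_boost_mhz, temp_c, power_w,
                    temp_limit_c=80, power_limit_w=170, load=1.0):
    """Toy model of GPU boost: start from the base clock and add boost
    headroom only while temperature, power, and load allow it.
    All limits here are made-up illustrative numbers, not real specs."""
    if load < 0.2:                 # near-idle: fall back to a low power state
        return 640                 # illustrative idle clock, like the one above
    headroom = max_boost_mhz - base_mhz
    # scale the boost down as we approach the temperature and power limits
    temp_factor = max(0.0, min(1.0, (temp_limit_c - temp_c) / 20))
    power_factor = max(0.0, min(1.0, (power_limit_w - power_w) / 30))
    return round(base_mhz + headroom * min(temp_factor, power_factor))

print(effective_clock(1000, 1254, temp_c=60, power_w=120))   # cool card: full boost
print(effective_clock(1000, 1254, temp_c=90, power_w=120))   # hot card: base clock
```

The point is just that the "real" clock is a moving target, so two tools reading it a second apart can legitimately print different numbers.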

I'm also curious about this. In RivaTuner or other monitoring software, what would the sampling rate be? Would it need to be 2× the clock frequency (Nyquist), about 2400 MHz in @urmomrules's case? If so, how could a CPU with a lower clock speed sample the results?
Or, on the other hand, are the GPUs/software reporting the average clock speed over a given interval (e.g., 1 second)?
I'm guessing that since the number isn't critical, the means of reporting it don't need much fidelity.
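My guess is Nyquist-rate sampling isn't needed at all: the driver exposes the current clock as a queryable value, and tools just poll it every second or so. A toy simulation (the fluctuation pattern and all the numbers are invented) shows how two pollers can disagree on instantaneous readings while slow polling still yields a sensible average:

```python
def clock_at(t):
    """Invented instantaneous clock trace (MHz): mostly boosted at 1202,
    with brief periodic dips to 1110 -- purely illustrative numbers."""
    return 1202 if int(t * 10) % 7 else 1110

def poll_average(sample_fn, period_s, n_samples, t0=0.0):
    """Poll the instantaneous clock every `period_s` seconds and average,
    roughly what a monitoring tool refreshing once a second does."""
    samples = [sample_fn(t0 + i * period_s) for i in range(n_samples)]
    return sum(samples) / len(samples)

# Two "tools" reading at slightly different moments see different values:
print(clock_at(0.0), clock_at(0.1))          # 1110 vs 1202
# ...but slow 1 Hz polling still recovers a reasonable average:
print(round(poll_average(clock_at, 1.0, 7)))  # 1189
```

That would explain the disagreements in the screenshots: each tool grabs its own snapshot of a moving value, not a synchronized measurement.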

Sorry for my late response; I've been super busy this last week. Anyway, I'm not sure what you're asking. I've actually been trying to underclock my GPUs so they run slower, but I have literally no idea which reading to go by. Also, I've recently been getting random artifacts on screen for a split second, maybe once or twice an hour, and I don't know if one of my cards is going bad or what. In Open Hardware Monitor, the clocks are half of what they should be (or so I suspect) on the chart, but they're accurate on its sensor tab. I've never liked Gigabyte's in-house V-Tuner, and I've used MSI Afterburner for most of everything I do. I just thought it was very odd that Heaven had my GPUs clocked at 1202 each, OHM at 1124 and 1137 respectively, V-Tuner at 900 and 913 respectively, and MSI Afterburner, offscreen, at some other clock. I'll try again on Sunday and see what I get. I haven't even thought about using GPU-Z, and that's what I first started with way back when, lol.

I just tested really quickly with GPU-Z as well. Here's the pic of that.

So Heaven shows 1272 for each. In Gigabyte's V-Tuner I have 1050 SET for each clock speed. I have disabled MSI Afterburner so the two tools aren't fighting over the clocks. Notice, though, that GPU-Z and OHM both show the same readings: 1202 for one clock and 1189 for the other (you can't see the second one for GPU-Z, but trust me, they read the same). So now I believe Open Hardware Monitor and GPU-Z are the accurate ones. I don't know how V-Tuner derives its clocks. I'm going to switch over, reboot, and test with Afterburner now.

Hmm. MSI Afterburner, GPU-Z, and OHM all read exactly the same: 1254 for one, 1241 for the other. So why is Heaven reading 1326, and V-Tuner reading whatever it reads?

Shit, sorry. Forgot to do a quick crop, lol.