Ohm's Law and GPU Lifespan While Mining?

To begin with, can somebody please explain Ohm's law?

Another thing I would like to ask about is card life while mining and how it relates to Ohm's law.

Say I were mining bitcoin.

Does the load on the card affect the lifespan of the card, assuming that all other factors such as voltage, wattage and such are kept the same, the only variable being load?

Of course there will be some variation in the voltage and wattage the card uses at idle versus under full load, but if we were to isolate it to load only, i.e. GPU usage and RAM usage, will it affect the life of the card?

I wanna get more of an idea of what mining bitcoins is actually doing to the life of my card.

Ohm's law states that the current through a conductor between two points is directly proportional to the potential difference across those two points, meaning I = V/R. In other words, it is how you calculate current flow when you know the resistance of a component and the voltage you are supplying it with.
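As a quick sketch of that formula (the 12 V and 0.5 Ω figures here are just made-up numbers for illustration, not real GPU values):

```python
# Ohm's law: I = V / R
voltage = 12.0    # volts (hypothetical supply rail)
resistance = 0.5  # ohms (hypothetical load)

current = voltage / resistance  # amps
print(f"{current} A")
```

Double the voltage at the same resistance and the current doubles too, which is all "directly proportional" means here.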

Load does affect the lifespan of any componentry; the transistors naturally wear out after lots of use, just like anything else.

(I'm not sure where I remember the following point from; if someone knows, please cite it.) The usual life expectancy of any silicon microprocessor is around 10 years at normal levels of usage. Mining is estimated to take around 1-2 years off that lifespan, provided the card stays within its thermal thresholds and is not over-volted.

I = V/R, where I = current, V = voltage, R = resistance

Ohm's law is basically a simple equation you rearrange to find whichever value you need, e.g. to find the voltage it's I x R.

To find the current, it's V divided by R.

To find the resistance, it's V divided by I.

[img]http://www.hondaforeman.com/attachments/how/7309d1318956570-electrical-system-explained-how-500px-ohms-law-triangle.jpg[/img]
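The three rearrangements from the triangle can be sketched like this (the 2 A / 6 Ω values are arbitrary, just to show the round trip):

```python
# The three faces of the Ohm's law triangle:
def voltage(i, r):
    return i * r        # V = I x R

def current(v, r):
    return v / r        # I = V / R

def resistance(v, i):
    return v / i        # R = V / I

# Round-trip check with made-up values: 2 A through 6 ohms
v = voltage(2.0, 6.0)            # 12 V
assert current(v, 6.0) == 2.0    # recover the current
assert resistance(v, 2.0) == 6.0 # recover the resistance
```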


Here's a more detailed wheel chart of Ohm's law (combined with Watt's law) that I still use today in my electrical engineering career.

[IMG]http://www.rmcybernetics.com/images/main/pyhsics/ohms-watts_law.jpg[/IMG]
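The wheel just combines Ohm's law with Watt's law (P = V x I). A quick sketch with hypothetical numbers (the 1.1 V / 150 A figures are illustrative, not specs for any real card):

```python
# Watt's law combined with Ohm's law, as in the wheel chart above
v = 1.1    # volts (hypothetical GPU core voltage)
i = 150.0  # amps (hypothetical core current)

p = v * i  # P = V x I, power in watts
r = v / i  # R = V / I, effective resistance in ohms
print(f"P = {p:.1f} W, R = {r:.5f} ohm")
```

Any two known values on the wheel get you the other two, which is why it's handy to keep around.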

Won't bother explaining Ohm's law since these fine gentlemen took care of it. But I will suggest that you look into Kirchhoff's current law if you want to learn more about electricity, as well as electromigration if you want to learn more about the lifespans of processor cores and how different voltages/currents affect them.

EDIT: Here is the link to the electromigration article I read; it's very detailed. Good stuff: http://www.csl.mete.metu.edu.tr/Electromigration/emig.htm

The main thing for wear is temps. If you keep your GPU below 80°C and run it 24/7, you might take a year or possibly two off its 10-year lifespan. If you run higher temps, though, you might lop off a bit more (maybe 2-3 years). The thing is, that doesn't really matter. In 10 years that GPU will be totally obsolete anyway and won't even be worth anything. A $25 card in 10 years will shit on today's cards.

My 7870 (late Q3 2012) shits on my fried 8800GT (from early 2008). It's a full DX level up anyhow, with four times as much onboard memory.

One of the main problems with GPUs is the stress put on the card and its components by expansion and contraction when going from idle (low temp, ~35°C) to high temp (~85°C). During gaming this happens a lot, but to a lesser degree. As long as you maintain reasonable, constant temps (~75°C), clock speeds matter little until you over-volt; then you have to take electromigration into the equation. But even then, who in their right mind is still using a GPU from 2003?

It's not Ohm and Kirchhoff you need to be listening to for this, it's Black's equation: http://en.wikipedia.org/wiki/Black%27s_equation

Temperature is the killer. The way I see it, high temps wear out your insulators, causing a short circuit, or the electromigration (from high current) gets great enough to actually break a connection.
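A rough sketch of Black's equation in code, to show how strongly temperature shows up. The prefactor A, the exponent n = 2, and the 0.7 eV activation energy are illustrative assumptions here; real values are fit per process, so only the ratio between two temperatures means anything:

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def mttf(current_density, temp_k, a=1.0, n=2.0, ea=0.7):
    """Black's equation: MTTF = A * J^(-n) * exp(Ea / (k * T)).

    a, n, ea are assumed illustrative constants, so the result is in
    arbitrary units -- only ratios between calls are meaningful.
    """
    return a * current_density ** (-n) * math.exp(ea / (BOLTZMANN_EV * temp_k))

# Same (made-up) current density, 75 C core vs 85 C core:
cool = mttf(1e6, 273.15 + 75)
hot = mttf(1e6, 273.15 + 85)
print(f"lifetime ratio, 75C vs 85C: {cool / hot:.2f}")
```

With these assumed constants, a 10°C bump roughly halves the electromigration lifetime, which is why temps dominate the wear discussion above.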

Also, your processor "clock" is a switch that essentially turns on and off very fast. For simplicity, let's say you're running 60 Hz and you o/c to 120 Hz: that switch will be in the on position the same amount of time per second, just less time per tick. From this I'd say clock speed does not affect your CPU life, because it is on the same amount of time. But I am not sure if there is an overcurrent in each tick, which would cause more degradation...
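The duty-cycle arithmetic in that argument can be sketched like this (a toy model of the switch, not a claim about real CPU power draw):

```python
# At a fixed duty cycle, the switch's total "on" time per second is the
# same at any clock frequency -- only the individual tick gets shorter.
def on_time_per_second(freq_hz, duty_cycle=0.5):
    tick = 1.0 / freq_hz                # seconds per tick
    return freq_hz * tick * duty_cycle  # = duty_cycle seconds, regardless of freq_hz

print(on_time_per_second(60), on_time_per_second(120))
```

Algebraically freq_hz * (1/freq_hz) cancels, so both calls give the same on-time, which is the poster's point.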

thanks guys