Since it seems that we are reaching the physical threshold on current technology as far as processing nodes go, do you think that we will see enough gains in the next 20 years or so to keep us excited enough about compute power to continue with yearly upgrades, or will we start to see high end rigs that last 4-5 years or more?
Working with silicon in a laboratory for years I've learned quite a lot about the integrity of electronic media. There is an electrical process called electromigration, in which metal ions in a conductor gradually drift under current flow and cause circuit paths in electronic components to degrade. As the interconnects thin out, their resistance rises and signal paths slow down; this affects copper interconnects as well as the silicon devices themselves. The effect is accelerated by heat, and thus hot/overclocked systems suffer more from this phenomenon.
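The heat dependence mentioned above is usually modeled with Black's equation, where median time to failure falls with current density and rises exponentially with inverse temperature. Here is a minimal sketch of that relationship; the prefactor, current-density exponent, and activation energy are illustrative placeholder values, not measured constants for any real process:

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K


def electromigration_mttf(current_density, temp_kelvin,
                          prefactor=1.0, exponent=2.0, activation_ev=0.9):
    """Black's equation: MTTF = A * J^-n * exp(Ea / (k * T)).

    Returns a relative lifetime figure; A, n, and Ea here are
    illustrative assumptions, not values for any specific process.
    """
    return (prefactor * current_density ** -exponent
            * math.exp(activation_ev / (BOLTZMANN_EV * temp_kelvin)))


# Same chip, two junction temperatures 20 K apart (e.g. a clean vs.
# dust-clogged heatsink): the exponential term dominates the lifetime.
cool = electromigration_mttf(1.0, 330.0)
hot = electromigration_mttf(1.0, 350.0)
print(f"lifetime ratio cool/hot: {cool / hot:.1f}x")  # roughly 6x with these assumed constants
```

With these placeholder parameters, a mere 20 K temperature rise cuts the modeled lifetime several-fold, which is why heat (and overclocking) matters so much here.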
As heatsinks and fans get clogged with dust, their cooling efficiency drops and hotter digital electronics run slower (modern chips will also throttle their clocks to protect themselves). This is especially true of processors and RAM, so keeping things clean and re-applying thermal paste under your CPU heatsink for maximum heat transfer can help. The denser, tighter-tolerance circuitry inside new components means proper maintenance is essential for keeping a computer running for many years. Older electronics (80s/90s era) could endure neglect thanks to their lower frequencies and wider spacing between circuit features.
A good way to answer your question is to recognize that as the manufacturing process shrinks, power density and TDP become the most important factors in determining a component's lifespan. Architectures have to lean on smaller transistors, lower-voltage structures, and clock gating to rein in energy consumption. I would predict that in 20 years the transistor will become an obsolete medium for computing, replaced by a different technology such as quantum computing that can keep making computing faster without suffering from electromigration.
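The power-saving levers listed above all fall out of the standard dynamic-power relation P ≈ α·C·V²·f: lowering voltage pays off quadratically, and clock gating shrinks the activity factor α. A minimal sketch, with made-up capacitance, voltage, and activity numbers purely for illustration:

```python
def dynamic_power(capacitance_f, voltage_v, frequency_hz, activity=0.10):
    """Dynamic switching power: P = alpha * C * V^2 * f (watts)."""
    return activity * capacitance_f * voltage_v ** 2 * frequency_hz


# Hypothetical chip: 1 nF effective switched capacitance at 3 GHz.
baseline = dynamic_power(1e-9, 1.2, 3.0e9)                  # 1.2 V supply
lower_v = dynamic_power(1e-9, 1.0, 3.0e9)                   # drop to 1.0 V: V^2 savings
gated = dynamic_power(1e-9, 1.0, 3.0e9, activity=0.07)      # plus clock gating idle logic

print(f"baseline: {baseline:.2f} W, lower voltage: {lower_v:.2f} W, "
      f"with clock gating: {gated:.2f} W")
```

Less power dissipated means lower junction temperatures, which (per the electromigration discussion above) directly extends how long the part lasts.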