Today I had the pleasure of seeing a documentary about electrical current. Among other things, it offered a description of how electricity flows and what resistance is. Resistance causes a conductive object to heat up and, in the process, conduct electricity less efficiently.
It's well known that heat is bad for computer hardware: it shortens the lifetime of components, and they risk burning out if not cooled properly. But the documentary raised another question for me: do heated circuits decrease performance by any noticeable amount due to resistance? For example, is a hot CPU slower, and could that result in visible lag and longer loading times for programs?
If anything, I assume it would be a matter of milliseconds at most, since electricity doesn't travel that far inside a PC. Then again, a lot of components work on the principle of synchronization: if a signal arrives a millisecond late, it could force other hardware to wait a few milliseconds more. Heat and resistance could make a difference there.
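As a rough sanity check on the travel-time part, here is my own back-of-the-envelope figure, assuming a ~10 cm motherboard trace and signals propagating at about half the speed of light (a typical ballpark for copper traces on PCB material):

```python
# Back-of-the-envelope: how long does a signal take to cross a motherboard trace?
# Assumed numbers: a ~10 cm trace and a propagation speed of roughly half the
# speed of light, which is a common rough figure for copper traces on a PCB.
C = 3.0e8             # speed of light, m/s
trace_length = 0.10   # m, a generously long trace
speed = 0.5 * C       # assumed signal propagation speed in the trace
delay_ns = trace_length / speed * 1e9
print(f"{delay_ns:.2f} ns")  # ~0.67 ns
```

Even with generous assumptions that comes out to well under a nanosecond, so if heat has a synchronization effect, it would have to come from somewhere other than raw signal travel time.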
Have any benchmarks been done on CPUs, memory, video cards, and chipsets in this regard? And are there any established figures on how much heat decreases performance? Personally, I've never gotten this impression: as far as I can tell, my computer runs just as fast in winter (when my room is rather cold) as in summer, when it's painfully hot.
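For what it's worth, here's a rough sketch of how one could test this at home: time a fixed CPU-bound workload repeatedly while logging the core temperature as the chip heats up. This assumes Linux and the third-party psutil package, and the "coretemp" sensor name is a guess that varies by machine:

```python
# Rough sketch, not a rigorous benchmark: run a fixed CPU-bound task repeatedly
# and log how long each run takes alongside the CPU temperature. If heat had a
# noticeable effect, run times should creep up as the temperature climbs.
import time
import psutil

def workload():
    # Fixed CPU-bound task: sum of integer square roots.
    total = 0
    for i in range(2_000_000):
        total += int(i ** 0.5)
    return total

def cpu_temp():
    temps = psutil.sensors_temperatures()  # Linux only; empty dict elsewhere
    cores = temps.get("coretemp", [])      # sensor name is an assumption
    return cores[0].current if cores else float("nan")

for run in range(50):
    start = time.perf_counter()
    workload()
    elapsed = time.perf_counter() - start
    print(f"run {run:2d}: {elapsed:.4f} s at {cpu_temp():.1f} °C")
```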