I recently built a roughly $1,000 gaming PC with an EVGA GeForce GTX 1060 6GB. It runs any modern game I've thrown at it with ease (probably because I'm running it on an ancient 1680x1050 60 Hz monitor). With VSync off, it runs BioShock Infinite at 200+ FPS on ultra settings (unless Elizabeth, your partner throughout the game, is in the scene, in which case it drops to 120-150 FPS or so). Anyway, I recently started playing Crysis 3. I know the Crysis games are hard to run, but I didn't expect this: I get frame drops down to 30 FPS on ultra settings, yet the weird part is that GPU utilization hovers around 70%, whereas in BioShock Infinite it's pinned at 100% with VSync off.
Why would the game not use the GPU's full power if it can't produce enough frames? The computer isn't hot: Afterburner reports the graphics card at 64 °C and the CPU at ~45-50 °C. I have a few thermistors in the mail so I can program the case fans based on the graphics card's temperature.
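For context, here's roughly what I have in mind for the fan control, as a minimal Arduino-style sketch. The pin numbers, the 10 kΩ divider, and the Beta value of 3950 are placeholder assumptions until the actual thermistors arrive; I'll use whatever their datasheet says.

```cpp
// Rough sketch of the planned fan controller (Arduino-style C++).
// Assumed wiring: 10k fixed resistor from 5V to A0, 10k NTC thermistor
// from A0 to ground, and a 4-pin fan's PWM input on pin 9.
// BETA and the pins are placeholders -- check the thermistor datasheet.

const int THERM_PIN = A0;
const int FAN_PIN   = 9;

const float R_FIXED = 10000.0;  // fixed divider resistor (ohms)
const float R_NOM   = 10000.0;  // thermistor resistance at 25 C (ohms)
const float BETA    = 3950.0;   // Beta coefficient (assumed)
const float T_NOM   = 298.15;   // 25 C in kelvin

void setup() {
  pinMode(FAN_PIN, OUTPUT);
}

void loop() {
  // Read the divider and convert the ADC value to thermistor resistance.
  int adc = analogRead(THERM_PIN);
  adc = constrain(adc, 1, 1022);             // avoid divide-by-zero at the rails
  float r = R_FIXED * adc / (1023.0 - adc);

  // Beta-equation approximation: 1/T = 1/T0 + ln(R/R0)/B
  float tempK = 1.0 / (1.0 / T_NOM + log(r / R_NOM) / BETA);
  float tempC = tempK - 273.15;

  // Simple curve: fan off below 40 C, ramp linearly to full speed at 70 C.
  int duty;
  if (tempC <= 40.0)      duty = 0;
  else if (tempC >= 70.0) duty = 255;
  else                    duty = (int)((tempC - 40.0) / 30.0 * 255.0);
  analogWrite(FAN_PIN, duty);

  delay(1000);  // update once a second
}
```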
Is this an indication of poor game programming?
My secondary question is this: it's not necessarily a bad thing, but my graphics card doesn't really heat up, even under 100% load. Watching other people benchmark cards and coolers online, temps in the upper 70s and even 80s °C seem normal, whereas I've never seen mine above 64 °C, even after running it pinned at 100% for a few hours. Is that normal? It's certainly not bad; I'm just wondering why my card runs so cool. Sure, I built the computer to run cool (full tower, 3× 140 mm case fans, 2× 120 mm fans, plus the PSU, GPU, and CPU fans, with the GPU fan curve set to spin up earlier and faster than stock), but I didn't imagine it'd run THIS cool with so little tweaking.
EDIT: Apparently this card just runs cool at stock. People over at Guru3D tested a few other 1060s and they all ran at about 69 °C under 100% load. So... overclocking?!?!