Posted 26 May 2007 - 01:33 PM
I have an older PC and had to replace the video card. Because of the age of the PC, I went with an Nvidia GeForce FX5500 (that's the most current card my motherboard can handle). Everything is running fine except for World of Warcraft. I am getting low frame rates, and occasionally the graphics will break up. After getting some help from tech service, they suggested that my video card may be overheating.
Here's my question: Is there some way I can easily check the temperature of the card, and what ballpark temperature range would be considered OK? I'd like to be able to confirm what Blizzard is suggesting before I call the card vendor to complain. There are no temp sensors on the card or anywhere in the PC (remember - old PC. lol). Also, the card has a heat sink instead of a fan. Could I possibly use one of those infrared thermometers, and if so, what do I point it at on the card?
Thanks in advance.
I hope this isn't too dumb a question, but is it possible that there could be a difference between two vendors' cards if they are both AGP FX5500 with 256MB? In other words, two cards with the same specs but from different vendors?
Yes, there can be a difference. The vendors pay to use the chipset and technology, but they design the circuitry themselves.
And when you think about it, it's really kind of sad that I'm going through all this for a game. Oh well....