
Volts vs Amps for power supply specs. What's the importance of each to power a video card?


1 reply to this topic

#1 peterk312

  • Members
  • 95 posts

Posted 27 January 2014 - 06:11 PM

I'm trying to understand the +12V current rating required for a specific AGP video card, to see if my existing power supply might be too weak to run the card. There are two cards I'm interested in: a GeForce FX 5500 and a GeForce FX 6200. I'm not so interested in the recommended wattage for a power supply unit. When I go to the specs for these two cards, it says:
 
"Minimum recommended power supply with +12 Volt current rating of 18 Amp."

I know the specs for the existing power supply in my computer indicate the following:

+3.3V@15A
+5V@11A
+12V@5A
+12.8V@7.5A
-12V@0.15A
+5VSB@3A

If my power supply doesn't provide at least an 18 amp current rating on the +12V rail, does that mean the video card won't work? What is the real significance of the recommended amperage?

I'm concerned for a few reasons, but mostly because I have a proprietary power supply unit (non-standard in size and voltages) that will be very tough to simply upgrade.




#2 jonuk76

  • Members
  • 2,178 posts
  • Location:Wales, UK

Posted 28 January 2014 - 01:59 PM

Amps x Volts = Watts (I don't mean to teach you to suck eggs, but it's worth reiterating).

 

Modern PCs draw most of their power from the 12v rail.  For example, ALL of the power for the CPU is supplied on the 12v rail in anything from the Pentium 4 onwards (or any CPU that uses a separate 4/8-pin ATX12V plug).  A decent explanation of why this is the case now, but wasn't in very old PCs, can be found here - http://en.wikipedia.org/wiki/Power_supply_unit_%28computer%29#ATX12V_standard  Video cards also get most of their power from the 12v rail.  For example, a video card that draws 240w at maximum load will be drawing 20A on the 12v rail.  A 95w CPU will be drawing around 8A, a hard disk typically around 1A, and so on.
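The Amps x Volts = Watts arithmetic above can be sketched in a few lines; this is plain arithmetic using only the figures quoted in this thread, not any official PSU calculation:

```python
def watts(volts, amps):
    """Power in watts delivered at a given rail voltage and current."""
    return volts * amps

def amps(load_watts, volts):
    """Current in amps that a given load pulls from a rail."""
    return load_watts / volts

# The 240 W video card example: 240 W / 12 V = 20 A on the 12 V rail.
print(amps(240, 12))   # 20.0

# The asker's PSU is rated +12V @ 5A, i.e. only 60 W on that rail,
# well short of the card's recommended 18 A (216 W) +12V rating.
print(watts(12, 5))    # 60
print(watts(12, 18))   # 216
```

Running the numbers like this makes it clear why the amp rating on the 12v rail, rather than the total wattage of the unit, is the figure that matters.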

 

As for the 12.8v rail, that's a new one on me :)  It's not part of the ATX standard - from what I can find, it looks like a proprietary PSU used in old Dell and Compaq PCs.






