How does a power supply work?
- From: TVeblen <Killtherobots@xxxxxxx>
- Date: Sun, 18 Apr 2010 07:12:34 -0400
I'm trying to figure out how the math works on a power supply's ratings regarding wattage and current. Using my 750TX power supply's specs as an example, here are the listed ratings and my math (V x A = W in parentheses).
Input: 115v US x 10A = (1150 Watts)
+3.3v @ 24A = (79.2W)
+5v @ 28A = (140W)
Total Above: (219.2W) Combined rating in spec = 180W
12v @ 60A = (720W) 720W in spec
-12v @ .8A = (9.6W) 9.6W in spec
5vsb @ 3A = (15W) 15W in spec
Total of all the spec watts (using the 180W combined rating): (924.6W) for a 750W PS.
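The per-rail arithmetic above can be sketched out directly; the numbers below are just the 750TX figures quoted in this post, with the 180W combined cap on the 3.3v/5v rails substituted when totaling:

```python
# Per-rail arithmetic from the post: V x A = W for each rail.
rails = {
    "+3.3v": (3.3, 24),    # 79.2 W
    "+5v":   (5.0, 28),    # 140 W
    "+12v":  (12.0, 60),   # 720 W
    "-12v":  (12.0, 0.8),  # 9.6 W
    "5vsb":  (5.0, 3),     # 15 W
}

for name, (volts, amps) in rails.items():
    print(f"{name}: {volts * amps:.1f} W")

# Raw 3.3v + 5v math exceeds the spec's combined cap for those rails.
combined_3v3_5v = 3.3 * 24 + 5.0 * 28
print(f"3.3v + 5v raw total: {combined_3v3_5v:.1f} W (spec caps the pair at 180 W)")

# Summing the spec maxima (180 W combined figure, not the raw 219.2 W)
# still lands well above the 750 W label.
total_spec = 180 + 720 + 9.6 + 15
print(f"Sum of spec maxima: {total_spec:.1f} W on a 750 W unit")
```

Run as-is, this reproduces the 219.2W and 924.6W figures above, which is exactly the mismatch the question is about.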
So, in layman's terms, how does this math work?
I can understand a loss of power between input and output due to conversion, but I don't understand why the math doesn't add up on the output side.
Specifically, if someone has a 360W power supply and is trying to run a high-end video card, could you really expect it to reliably supply 12v at 30A, or will that output be reduced by the draw on the other rails? And how could a layperson determine what amperage to reasonably expect from that power supply for the video card?
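To make the 360W worry concrete, here is the same V x A arithmetic applied to that hypothetical unit; the numbers are only the ones named in the question, not a real power supply's spec sheet:

```python
# Hypothetical 360 W unit from the question above.
psu_label_watts = 360

# If the 12v rail really delivered 30 A, that one rail alone
# would claim the entire label rating.
twelve_v_draw = 12.0 * 30
leftover = psu_label_watts - twelve_v_draw

print(f"12v @ 30A = {twelve_v_draw:.0f} W")
print(f"Budget left for every other rail: {leftover:.0f} W")
```

That arithmetic is the heart of the question: the 12v figure by itself equals the whole 360W label, so something has to give once the other rails draw power too.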