Power supplies

Cobra

Can two 500 watt power supplies be attached to provide 1000 watts to an inverter?
 
Maybe. It would depend on how well matched the power supplies are. I would expect one of the power supplies to supply the bulk of the load, with the other trying to make up the difference. I don't think there is a way to really get them in balance.

What hardware are you proposing to use and what is the end use?
 
I have a construction-grade generator that works well for lights and the like in a power outage, but it does not have clean enough power to operate the furnace or the gas oven. I want to run the generator through the power supplies and then through an inverter I have to provide power for the sensitive items.
 
I would buy a couple of deep cycle batteries and a battery charger, and plug the battery charger into the generator. I'm surprised that the generator puts out such dirty power; it must have a really crappy inverter in it.
 
Probably not an inverter-type generator. Assuming your furnace and stove are both gas powered, the control circuitry shouldn't draw much power at all. I'd just buy a computer UPS and wire that up for them.

We use these units for our network switches and have had good luck with them in that application.


John
 
Unfortunately, because the motor and controller are on the same circuit, it is more straightforward to handle both. Hence the need for enough wattage to handle the motor as well.
 
UPS units come in higher wattages, just more money ;)
 
What kind of power supplies?

If the supplies are fairly close in characteristics, a small resistance in series with each one could be added to help balance the current draw. If a resistance value R is placed in series with each power supply, then for output voltages V1 and V2 at any given time and total current draw i, the difference in current, i2 - i1, will be (V2 - V1)/R, with i = i1 + i2. You would want to choose the resistors such that the maximum current draw allowed for either supply isn't exceeded at full load.
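A quick sketch of that arithmetic in Python, with made-up example values (the voltages, resistance, and load current here are assumptions for illustration):

```python
def share_currents(v1, v2, r, i_total):
    """Split of a total load current between two supplies, each with a
    resistance r (ohms) in series, tied to a common bus.
    From i1 + i2 = i_total and i1 - i2 = (v1 - v2) / r."""
    delta = (v1 - v2) / r          # imbalance set by the voltage mismatch
    i1 = (i_total + delta) / 2     # supply with the higher voltage works harder
    i2 = (i_total - delta) / 2
    return i1, i2

# Example: 0.1 V mismatch, 0.024 ohm resistors, 84 A total load
i1, i2 = share_currents(12.1, 12.0, 0.024, 84.0)
print(f"i1 = {i1:.1f} A, i2 = {i2:.1f} A")  # roughly 44 A and 40 A
```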

Any d.c. power supply can be described as a battery in series with a resistance. The resistance represents the drop in voltage as current is increased. Switching power supplies tend to have low series resistance compared to linear supplies.

The disadvantage of adding resistance is that it decreases the available voltage at full load and decreases the efficiency of the power supply. The feedback loop could be modified to place the resistors within the loop, which would give better regulation, but that can be tricky.
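Treating each supply that way, here is a small sketch (assumed numbers again; internal resistance and the added balancing resistor are lumped into one series value) that solves for the bus voltage and the current each supply delivers:

```python
def bus_solve(v1, r1, v2, r2, i_load):
    """Two supplies modeled as ideal sources behind series resistances
    (internal resistance plus any added balancing resistor), feeding a
    common bus that draws a fixed current i_load.
    KCL at the bus: (v1 - vb)/r1 + (v2 - vb)/r2 = i_load."""
    vb = (v1 / r1 + v2 / r2 - i_load) / (1 / r1 + 1 / r2)
    return vb, (v1 - vb) / r1, (v2 - vb) / r2

# Hypothetical: 24 mOhm balancing resistors plus ~4 mOhm internal resistance
vb, i1, i2 = bus_solve(12.1, 0.028, 12.0, 0.028, 84.0)
print(f"bus = {vb:.2f} V, i1 = {i1:.1f} A, i2 = {i2:.1f} A")
```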
 
At high enough currents your balancing "resistors" could literally be lengths of wire or an extension cord.
M
 
A 12 volt input inverter drawing 1000 watts would require around 90 amps at minimum input voltage. The 500 watt supplies outputting 12 volts would be capable of furnishing about 42 amps each. Dropping the output voltage 1 volt via the balancing resistors at that current would require about a .024 ohm resistance. 16 AWG wire has a resistance of about .004 ohms/ft., so roughly 6 ft. of 16 AWG would suffice.
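The same sizing arithmetic as a quick sketch (the minimum inverter input voltage and the 16 AWG resistance figure are assumed round numbers):

```python
# Rough sizing of balancing resistors made from lengths of wire.
INVERTER_W     = 1000.0  # inverter output power, watts
V_MIN_IN       = 11.0    # assumed minimum inverter input voltage
SUPPLY_W       = 500.0   # rating of each supply, watts
V_SUPPLY       = 12.0    # supply output voltage
R_16AWG_PER_FT = 0.004   # ohms per foot of 16 AWG, approximate

i_total  = INVERTER_W / V_MIN_IN      # ~91 A total draw at minimum voltage
i_each   = SUPPLY_W / V_SUPPLY        # ~42 A available from each supply
r_needed = 1.0 / i_each               # resistance that drops 1 V at 42 A
feet     = r_needed / R_16AWG_PER_FT  # wire length giving that resistance

print(f"total ~{i_total:.0f} A, ~{i_each:.0f} A per supply")
print(f"R ~{r_needed:.3f} ohm -> ~{feet:.0f} ft of 16 AWG")
```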

If there were a .1 volt difference between the two supplies, the difference in current would be about 4 amps, i.e. 44 amps from one and 40 from the other. This is probably within the tolerance for the power supplies. A higher voltage difference will increase the current difference. Likewise, a power supply putting out 15 volts feeding an inverter with a minimum input voltage of 11 volts will allow a 4x increase in R and therefore a 4x larger tolerable voltage difference. Increasing the power rating of the two supplies will also permit a greater voltage variation. I would probably look at using 600 watt supplies; at a 50 amp output, that would allow for a 10 amp difference between the supplies.
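That headroom argument in sketch form (sizing numbers assumed as above): the resistor is sized by the voltage you can afford to drop, and the tolerable supply-to-supply mismatch scales with it:

```python
def tolerable_mismatch(v_drop_budget, i_rated, delta_i_allowed):
    """Size the balancing resistor to drop v_drop_budget at the rated
    per-supply current, then report the supply-to-supply voltage
    mismatch that produces a current imbalance of delta_i_allowed."""
    r = v_drop_budget / i_rated
    return r, delta_i_allowed * r

# 12 V supplies into an 11 V minimum inverter input: ~1 V budget
print(tolerable_mismatch(1.0, 42.0, 4.0))  # (~0.024 ohm, ~0.10 V)
# 15 V supplies into the same inverter: ~4 V budget, 4x the R and mismatch
print(tolerable_mismatch(4.0, 42.0, 4.0))  # (~0.095 ohm, ~0.38 V)
```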

Switching power supplies usually have a user-adjustable output voltage, so two can be set to matching voltages. They are also very good at maintaining a constant output voltage over varying output loads and input voltages; linear supplies, not so much.
 