Increasing amps

   / Increasing amps #11  
Plug it into a 20 amp circuit and turn it all the way up. If you can run a bead without popping the breaker, there's no advantage. If you pop the breaker, then there's an advantage.

NEXT!

Can't hurt. If you have the 30 amp already available, I'd use it. That 30 amp circuit should be wired with at least 10 ga wire, which will have much less voltage drop and allow more current draw than a 15 amp or 20 amp circuit, which will be wired with 14 ga or 12 ga respectively.
Yes. Regardless of a 20 A breaker holding, the performance advantage is with the higher amp circuit due to less voltage loss in the larger wiring.
larry
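The wire-gauge point can be sanity-checked with a quick voltage-drop calculation. This is only a sketch: the per-1000-ft resistances are standard values for uncoated copper, and the 50 ft run length and 19 A load are assumed numbers for illustration.

```python
# Approximate resistance of copper wire, ohms per 1000 ft (one conductor)
R_PER_1000FT = {"14ga": 2.525, "12ga": 1.588, "10ga": 0.999}

def voltage_drop(gauge, amps, one_way_ft):
    """Round-trip voltage drop for a given load current and run length."""
    r = R_PER_1000FT[gauge] / 1000 * one_way_ft * 2  # hot + neutral
    return amps * r

for g in ("14ga", "12ga", "10ga"):
    print(f"{g}: {voltage_drop(g, 19, 50):.2f} V drop at 19 A over 50 ft")
```

At 19 A over an assumed 50 ft run, the 14 ga circuit loses roughly 4.8 V in the wiring while the 10 ga circuit loses under 2 V, which is the "less voltage loss in the larger wiring" point in numbers.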
 
   / Increasing amps #12  
Yes. Regardless of a 20 A breaker holding, the performance advantage is with the higher amp circuit due to less voltage loss in the larger wiring.
larry

Help me out with that, if you would? My thinking would be that a given load is going to pull a given number of watts, and that is constant. If voltage is lower, amps will go up (and so will heat in the wire), but the wattage will always be the same. So take a welder operating at max output on 110 V: it pulls 19 amps = 2,090 watts. Now it's on a circuit with less voltage drop and it's getting 120 V. It pulls 17 amps instead of 19, but otherwise it performs exactly the same, doesn't it?
 
   / Increasing amps #13  
Nope - the larger supply wire size causes less drop in supply voltage - having a higher voltage available to the welder means that, for any given electrical resistance/impedance, a larger current can flow.

This is ignoring any change in the welder's input resistance due to the possibility of more heating (resistance of all metals varies with temperature; some have a positive temp coefficient, some negative). As in, warmer could mean either more or less resistance, which would also enter in.

But basically, if you have more voltage available at the input, you get more current out... Steve

Temperature Coefficient of Copper
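The disagreement above boils down to two different models of the welder's input. A short sketch contrasting them (the 6.3 Ω resistance and 2,090 W figures are illustrative assumptions chosen to roughly match the 19 A @ 110 V example):

```python
# Two simplified load models at the welder input:
# 1) fixed resistance: current AND power rise with supply voltage
# 2) constant power: current falls as supply voltage rises

def fixed_resistance_draw(v_supply, r_load=6.3):
    """Ohm's-law load: returns (amps, watts) for a fixed input resistance."""
    i = v_supply / r_load
    return i, v_supply * i

def constant_power_draw(v_supply, watts=2090):
    """Constant-wattage load: returns (amps, watts) at a given supply voltage."""
    return watts / v_supply, watts

for v in (110, 120):
    i_r, p_r = fixed_resistance_draw(v)
    i_p, p_p = constant_power_draw(v)
    print(f"{v} V: fixed-R draws {i_r:.1f} A / {p_r:.0f} W, "
          f"constant-P draws {i_p:.1f} A / {p_p:.0f} W")
```

Under the fixed-resistance model Steve describes, going from 110 V to 120 V raises both the current and the power delivered, so the welder genuinely works harder. Only under the constant-wattage assumption does performance come out identical, and that assumption only holds for welders with regulated (electronically pinned) outputs.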
 
   / Increasing amps #14  
Help me out with that, if you would? My thinking would be that a given load is going to pull a given number of watts, and that is constant. If voltage is lower, amps will go up (and so will heat in the wire), but the wattage will always be the same. So take a welder operating at max output on 110 V: it pulls 19 amps = 2,090 watts. Now it's on a circuit with less voltage drop and it's getting 120 V. It pulls 17 amps instead of 19, but otherwise it performs exactly the same, doesn't it?

It needs to be fed enough volts and amps in the first place. If you have too much voltage drop and/or too small a wire, it's like drinking a shake through a stirring stick.
 
   / Increasing amps #15  
Yes. Regardless of a 20 A breaker holding, the performance advantage is with the higher amp circuit due to less voltage loss in the larger wiring.
larry

Help me out with that, if you would? My thinking would be that a given load is going to pull a given number of watts, and that is constant. If voltage is lower, amps will go up (and so will heat in the wire), but the wattage will always be the same. So take a welder operating at max output on 110 V: it pulls 19 amps = 2,090 watts. Now it's on a circuit with less voltage drop and it's getting 120 V. It pulls 17 amps instead of 19, but otherwise it performs exactly the same, doesn't it?

Nope - the larger supply wire size causes less drop in supply voltage - having a higher voltage available to the welder means that, for any given electrical resistance/impedance, a larger current can flow.

This is ignoring any change in the welder's input resistance due to the possibility of more heating (resistance of all metals varies with temperature; some have a positive temp coefficient, some negative). As in, warmer could mean either more or less resistance, which would also enter in.

But basically, if you have more voltage available at the input, you get more current out... Steve

Temperature Coefficient of Copper

It needs to be fed enough volts and amps in the first place. If you have too much voltage drop and/or too small a wire, it's like drinking a shake through a stirring stick.
I appreciate your question ... and these good answers. I have very little to add, and some may be confusing. Briefly:

... The small wires drop more V at any current and thereby consume power within themselves.

... Welders differ. With electronics deriving the welding current, a welder can pin its output just fine with variable input V [within reason], but must pull greater current to do it as the voltage drops. Big wire is always your friend in any circuit, but one with an electronically pinned output will drive this home. The welder puts out the V × A set regardless of the input. Factor this against a 20 A breaker on 12 ga wire vs a 20 A breaker on 10 ga and you see the welder will pop the 12 ga breaker before the identical 10 ga ... because the welder draws more A on the 12 ga feed.

... Other welders can be just a variable transformer. These put out lower V as the input V goes down. Although V can drop a bit before you notice a performance change, it's not too hard to see that you'd like voltage to be nominal so they'd work as well as they can.

Stay as close to the power panel as feasible. Use oversized wire if you have to get very far away. Remember how that circular saw acts at the end of a 100' 12 ga vs 14 ga extension cord. Same idea.
larry
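larry's point about the electronically pinned welder drawing more amps on the smaller feed can be sketched as a constant-power load feeding back through the wire drop: the welder pulls P/V, the wire drop depends on that current, so you iterate to a consistent operating point. All numbers here are assumptions for illustration (2,090 W draw, 120 V at the panel, 50 ft one-way run, standard copper resistances):

```python
# Electronically regulated welder modeled as a constant-power load: it
# pulls whatever input current holds its set output. Wire drop depends
# on that current, so iterate to a self-consistent operating point.

R_PER_1000FT = {"12ga": 1.588, "10ga": 0.999}  # ohms per 1000 ft, copper

def operating_point(gauge, watts=2090, v_panel=120, one_way_ft=50):
    """Return (amps drawn, volts at the welder) for a given feed gauge."""
    r_wire = R_PER_1000FT[gauge] / 1000 * one_way_ft * 2  # round trip
    v = v_panel
    for _ in range(50):              # simple fixed-point iteration
        i = watts / v                # constant-power draw at current V
        v = v_panel - i * r_wire     # voltage remaining at the welder
    return i, v

for g in ("12ga", "10ga"):
    i, v = operating_point(g)
    print(f"{g}: {i:.2f} A drawn, {v:.1f} V at the welder")
```

With these assumed numbers the 12 ga feed settles at a higher current draw and a lower voltage at the welder than the 10 ga feed, which is exactly why the same welder sits closer to tripping the breaker on the smaller wire.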
 
   / Increasing amps #16  
... Other welders can be just a variable transformer. These put out lower V as the input V goes down. Although V can drop a bit before you notice a performance change, it's not too hard to see that you'd like voltage to be nominal so they'd work as well as they can.
larry

That's a good point. Just a few months welding on an inverter, and it's like I've completely forgotten transformers exist.
 
   / Increasing amps
  • Thread Starter
#17  
So for those saying it would, they would be thinking that the 20 amp is on 12 ga wire? Would it make any difference if both the 20 amp and the 30 amp are on 10 gauge? From what I'm interpreting, it doesn't.
 
   / Increasing amps #18  
Yes.
... And barely. There is some loss in a breaker associated with the energy to cause the physical "break". This is approximately the same regardless of the breaker's current rating. V across a breaker at its rating may be 0.05 V at a guess. A 30 A breaker at 20 A draw would then have 2/3 × 0.05 ≈ 0.033 V across it. ... Kinda no difference. :confused3:
larry
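larry's estimate above assumes a breaker's internal voltage drop scales roughly in proportion to the current through it. The 0.05 V at rated current is his guess, not a measured figure; the arithmetic itself is just:

```python
# Breaker drop estimate: drop at rated current is roughly the same for
# any rating (assumed 0.05 V, per the guess above) and scales with load.
V_AT_RATING = 0.05

def breaker_drop(rating_a, load_a):
    """Estimated voltage drop across a breaker at a given load current."""
    return V_AT_RATING * load_a / rating_a

print(f"20 A breaker at 20 A load: {breaker_drop(20, 20):.3f} V")
print(f"30 A breaker at 20 A load: {breaker_drop(30, 20):.3f} V")
```

Either way the difference between the breakers is a few hundredths of a volt, hence "kinda no difference" once both circuits are on 10 ga wire.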
 
   / Increasing amps #19  
A 20 amp breaker is smaller and passes current with more voltage loss than a 30 amp. Same idea as feeding a 1/2" spigot with a 3/4" hose - while you will have max water available, it will only flow as much as the 1/2" will allow.

But, as said, for a welder designed for 20amp, the difference would be very marginal.
 
 