A lesson in voltage and amperage, please

   / A lesson in voltage and amperage, please #1  

Perplexed · Silver Member · Joined Feb 28, 2011 · Messages 147 · Location NE Oklahoma
OK, as I understand Ohm's Law (I = V/R), there's an inverse relationship between voltage and amperage. That is, the more volts, the less amps, and vice versa. Correct?

And a plasma cutter needs more amps to cut through thicker metal - correct? Ditto for a welder, more amps are needed to weld up thicker pieces of metal... correct?

So why is it that the heavy-duty plasma cutters and welders pretty much require a 240V power supply, since those units would then have lower amperage outputs than a light-duty 120V unit?

I must be missing something here :confused: Please point out my mistakes!
 
   / A lesson in voltage and amperage, please #2  
I think it's because if they ran at 120 V, they would require more amperage than a typical household circuit can supply. If a plasma cutter is rated for 240 V at 30 A input, then the current requirement would most likely double if it were rated for 120 V: 120 V at 60 A, and so on.

A commercial grade plasma cutter would run off of a 3 phase circuit.
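
Just to make that arithmetic concrete, here is a rough sketch in Python. The 240 V / 30 A rating is only the example figure from above, not any particular cutter's spec:

    # Same input power drawn at half the voltage needs twice the current.
    input_power_w = 240 * 30                # 7200 W at 240 V / 30 A

    current_at_240 = input_power_w / 240    # 30 A
    current_at_120 = input_power_w / 120    # 60 A -- far more than a typical
                                            # 15 A or 20 A household circuit

    print(current_at_240, current_at_120)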
 
   / A lesson in voltage and amperage, please #3  
Welders and plasma cutters all use transformers to lower the OUTPUT voltage, which greatly increases the amperage at the rod. A welder may have 30 open-circuit volts, which will decrease under load.
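
A minimal sketch of that step-down idea, assuming an ideal transformer (losses and the voltage sag under load are ignored, and the wall-side numbers are illustrative assumptions only):

    # Ideal transformer: power in equals power out, so stepping the
    # voltage down steps the current up by the same factor.
    primary_voltage = 240.0      # volts at the wall (assumed)
    primary_current = 30.0       # amps drawn from the wall (assumed)
    rod_voltage = 30.0           # open-circuit volts at the rod, per the post

    power_in = primary_voltage * primary_current   # 7200 W
    rod_current = power_in / rod_voltage           # 240 A available at the rod

    print(rod_current)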
 
   / A lesson in voltage and amperage, please #4  
It also has to do with wire size. A 240 V circuit for this kind of equipment is wired with heavier conductors than an ordinary 110/120 V branch circuit; if you ran the equipment on the lighter wiring, the wires would heat up.
 
   / A lesson in voltage and amperage, please #5  
OK, as I understand Ohm's Law (I = V/R), there's an inverse relationship between voltage and amperage. That is, the more volts, the less amps, and vice versa. Correct?

And a plasma cutter needs more amps to cut through thicker metal - correct? Ditto for a welder, more amps are needed to weld up thicker pieces of metal... correct?

So why is it that the heavy-duty plasma cutters and welders pretty much require a 240V power supply, since those units would then have lower amperage outputs than a light-duty 120V unit?

I must be missing something here :confused: Please point out my mistakes!

The inverse relationship is between amperage and resistance. Voltage and amperage have a direct correlation: resistance being equal, more voltage means more amps, and vice versa. Take a look at the equation and try some numbers; start with R = 1 for an easy view.
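
Trying some numbers, as suggested, with R fixed at 1 ohm:

    # Ohm's law, I = V / R: with R held constant, current rises in step
    # with voltage -- a direct relationship.
    R = 1.0
    for V in (1, 10, 100, 240):
        I = V / R
        print(f"V = {V:>3} V, R = {R} ohm  ->  I = {I} A")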
 
   / A lesson in voltage and amperage, please #6 (Thread Starter)  
Hmmm... OK then, here's data off the UL label on the motor of a table saw:

12.8 A @ 115 V or 6.4 A @ 230 V

I had thought that meant the motor produced 12.8 amps at 115 (or 120) volts, and 6.4 amps at 230 (or 240) volts. An inverse relationship. What am I missing this time?
 
   / A lesson in voltage and amperage, please #7  
Think power instead of Ohm's law: W = I * V.
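
Worked through with the table-saw numbers from the previous post (the 1472 W figure is just 12.8 A x 115 V from the label):

    # With power W held fixed, halving the voltage doubles the current --
    # that's where the apparent "inverse" relationship comes from.
    W = 1472.0                  # watts, from the motor label above
    for V in (115, 230):
        I = W / V
        print(f"{V} V -> {I:.1f} A")   # 12.8 A and 6.4 A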
 
   / A lesson in voltage and amperage, please #8  
OK, as I understand Ohm's Law (I = V/R), there's an inverse relationship between voltage and amperage. That is, the more volts, the less amps, and vice versa. Correct?

And a plasma cutter needs more amps to cut through thicker metal - correct? Ditto for a welder, more amps are needed to weld up thicker pieces of metal... correct?

So why is it that the heavy-duty plasma cutters and welders pretty much require a 240V power supply, since those units would then have lower amperage outputs than a light-duty 120V unit?

I must be missing something here :confused: Please point out my mistakes!

No,
There is a DIRECT relationship between volts and amps, i.e. the higher the voltage the higher the current (through any given SAME resistance).

PART of the reason for most plasma cutters being 220-240 volt is manufacturing cost and design simplicity.

Ahhh, maybe you were thinking that for any given level of POWER there is an inverse relationship between voltage and current ?
Yes, for whatever power is needed OUT of a device it will consume less current from a 220 volt source than from a 110 volt source.
Clearer ?

This is where current affects cost.
Say there is some design arithmetic that determines that the input power will need to be 6,600 Watts (arbitrary, but reasonable).
That could be "sucked out of the wall" as 6600/220 = 30 amps or as 6600/110 = 60 amps.

In one case you need only half the current rating for the primary side that you need for the other case, i.e. thinner wires, cheaper plugs, switches, etc.
Theoretically you need more/better insulation for higher voltages, but the difference between insulating 220 and 110 is negligible.
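
The 6,600 W example above, worked through (an arbitrary design figure, as the post says):

    design_power_w = 6600.0

    amps_at_220 = design_power_w / 220    # 30 A
    amps_at_110 = design_power_w / 110    # 60 A

    # Half the current at 220 V means thinner primary wiring and cheaper
    # plugs and switches for the same delivered power.
    print(amps_at_220, amps_at_110)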
 
   / A lesson in voltage and amperage, please #9  
Hmmm... OK then, here's data off the UL label on the motor of a table saw:

12.8 A @ 115 V or 6.4 A @ 230 V

I had thought that meant the motor produced 12.8 amps at 115 (or 120) volts, and 6.4 amps at 230 (or 240) volts. An inverse relationship. What am I missing this time?

It is a 1472 Watt motor:

12.8A x 115V = 1472 Watts

6.4A x 230V = 1472 Watts
 
   / A lesson in voltage and amperage, please #10  
Hmmm... OK then, here's data off the UL label on the motor of a table saw:

12.8 A @ 115 V or 6.4 A @ 230 V

I had thought that meant the motor produced 12.8 amps at 115 (or 120) volts, and 6.4 amps at 230 (or 240) volts. An inverse relationship. What am I missing this time?

Nope,
It just means that for the POWER that it can DRAW (NOT produce) it can take it from the wall as 12.8 at 115 or as 6.4 at 230.
POWER is the PRODUCT (multiplication) of volts times amps.
1472 Watts in this case.
 
 