OK, as I understand Ohm's Law (I = V/R), there's an inverse relationship between voltage and amperage. That is, the more volts, the less amps, and vice versa. Correct?
And a plasma cutter needs more amps to cut through thicker metal - correct? Ditto for a welder, more amps are needed to weld up thicker pieces of metal... correct?
So why is it that the heavy-duty plasma cutters and welders pretty much require a 240V power supply, since those units would then have lower amperage outputs than a light-duty 120V unit?
I must be missing something here
:confused: :confused:
Please point out my mistakes!
Interesting question...
(Note: the higher the amperage, the higher the heat.)
Let's say you have a welder with an input requirement of 120 volts on a 30 amp circuit. As you increase the heat (turn the welder up to weld thicker material), your output voltage goes down, but the amperage goes up. So at max settings, let's say 140 amps, your output voltage drops to as little as 15 or 20 volts.

The transformer in the welder does this. Through the electromagnetic fields in its windings, it converts the high-voltage, low-amperage input (120 volts at 30 amps) into a low-voltage, high-amperage output (15-20 volts at 140 amps). Power in roughly equals power out, so stepping the voltage down steps the current up.

By using a machine that requires a 240 volt input (on, say, a 24 amp circuit), you start with more input power, so the transformer can deliver more output amperage. That 240 volt welder is now capable of putting out, say, 250 amps. Again, since the amperage went up, the output voltage goes down, to somewhere between 15 and 30 volts.

As amperage goes up, so does heat; that's why welders are rated in amps. The higher the amperage, the thicker the material you can weld in a single pass.

So why do we have a duty cycle? Well, all that current flow (the high amps) generates a lot of heat, and that heat can melt the windings inside your welder's transformer, so it has to shut off to cool down and protect itself. Then, once it's cool, you can start welding again.
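The step-down behavior above can be sketched as an ideal-transformer power balance. This is a simplification (a real welder has losses, so actual output is lower), and the circuit/arc-voltage numbers are just the example figures from this post:

```python
def output_current(input_volts, input_amps, output_volts):
    """Ideal transformer: power in equals power out (no losses assumed)."""
    input_power = input_volts * input_amps  # watts available from the wall
    return input_power / output_volts       # amps available at the arc

# 120 V machine on a 30 A circuit, arc voltage around 20 V
print(output_current(120, 30, 20))   # 180.0 A (ideal ceiling)

# 240 V machine on a 24 A circuit, same arc voltage
print(output_current(240, 24, 20))   # 288.0 A (ideal ceiling)
```

Same arc voltage, but the 240 V circuit starts with more watts, so more amps come out the other side.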
This is the reason power lines carry electricity at a much higher voltage than what's in our houses. It then goes through a transformer so it can be stepped down to 120/240 for home use.
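A quick sketch of why high-voltage transmission matters: for a fixed amount of delivered power, higher voltage means lower current, and the heat wasted in the wires goes as I² times the wire resistance. The wattage and resistance numbers below are made-up illustration values, not real grid figures:

```python
def line_loss_watts(power_delivered, line_volts, line_resistance_ohms):
    """Resistive loss in the wires: P_loss = I^2 * R, where I = P / V."""
    current = power_delivered / line_volts
    return current ** 2 * line_resistance_ohms

# Deliver 100 kW through wires with 1 ohm of total resistance
print(line_loss_watts(100_000, 240, 1.0))      # enormous loss at 240 V
print(line_loss_watts(100_000, 240_000, 1.0))  # tiny loss at 240 kV
```

Raising the voltage by a factor of 1000 cuts the wire loss by a factor of a million, which is why the step-up/step-down transformers are worth it.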
Correct me if I'm wrong, but I believe it was Thomas Edison who wanted our power grid to run off of DC current, and Nikola Tesla found that AC current was much more efficient, required less wiring, and was much more reliable than DC.
Now, on to more confusing stuff
:)
An electric motor with an input of 120 volts, with the same power as an electric motor with an input of 240 volts, requires twice the amperage to do the same amount of work.
ex: a 2 horsepower motor = 1492 watts
(assuming 100% efficiency for ease of calculation)
a 120 volt 2 hp motor draws 12.43 amps:
1492 watts / 120 volts = 12.43 amps
a 240 volt 2 hp motor draws 6.22 amps:
1492 watts / 240 volts = 6.22 amps
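Those two calculations as a sketch, using the same 100%-efficiency assumption as above (1 hp = 746 W; a real motor draws somewhat more):

```python
WATTS_PER_HP = 746  # one mechanical horsepower in watts

def motor_amps(horsepower, volts):
    """Current draw of an ideal (100% efficient) motor: I = P / V."""
    return horsepower * WATTS_PER_HP / volts

print(round(motor_amps(2, 120), 2))  # 12.43 A on 120 V
print(round(motor_amps(2, 240), 2))  # 6.22 A on 240 V
```

Double the voltage, half the amps, for the same work; that's also why the 240 V motor can get by with thinner supply wiring.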
Now, even more confusing: in an electric motor, when mechanical resistance increases, so does amperage. So when these same electric motors are put under a load, the current in the circuit increases and the voltage goes down. Try it: connect an inductive amp probe to your car's battery and measure cranking amperage. Then try it again with the spark plugs out; the amperage will go down, because the mechanical resistance of the compression strokes won't be there.
Hope I didn't bore you guys to death.
oh yeah
Hello, it's been a while
:)
-matt