Ahhh, maybe you were thinking that for any given level of POWER there is an inverse relationship between voltage and current?
Yes, for whatever power is needed OUT of a device, it will draw less current from a 220-volt source than from a 110-volt source.
Clearer?
This is where current affects cost.
Say there is some design arithmetic that determines that the input power will need to be 6,600 Watts (arbitrary, but reasonable).
That could be "sucked out of the wall" as 6600/220 = 30 amps or as 6600/110 = 60 amps.
In the 220-volt case you need only half the current rating on the primary side that you need in the 110-volt case, i.e. thinner wires, cheaper plugs, switches, etc.
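If it helps, here's the same arithmetic as a throwaway Python sketch (the 6,600 W figure is just the arbitrary example from above, and `current_draw` is a made-up name for illustration):

```python
def current_draw(power_watts, voltage):
    """Rearrangement of P = V * I: the current a load pulls
    from the wall for a given input power and supply voltage."""
    return power_watts / voltage

# Same 6,600 W load on the two common supply voltages:
print(current_draw(6600, 220))  # 30.0 amps
print(current_draw(6600, 110))  # 60.0 amps -- twice the current, so heavier wiring
```

Same power in both cases; only the current (and therefore the wire gauge, plugs, and switches) changes.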
Theoretically you need more/better insulation at higher voltages, but the difference between insulating for 220 and for 110 is negligible.
Now this clears up my confusion! I think I understand better the relationship between voltage, amperage, and output when it comes to using a cutter or welder. Thanks, Reg! And everyone else who contributed :thumbsup: