rus geek: If I understand you correctly, you're saying that 40 amps on a 220 V circuit would work if the run is under 84 feet?
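If that's the idea, my guess is the 84-foot figure falls out of a voltage-drop calculation rather than the ampacity tables. Here's my rough stab at the arithmetic in Python (the wire resistances are room-temperature copper numbers off a standard AWG chart, and the 3% limit is just the usual rule of thumb, so take this as a sketch, not gospel):

```python
# My guess at where an "under 84 feet" figure comes from: voltage drop.
# Resistances are room-temperature copper values from a standard AWG chart.
OHMS_PER_1000FT = {12: 1.588, 10: 0.999, 8: 0.628}

def voltage_drop(awg, amps, one_way_feet, volts=220.0):
    round_trip_ft = 2 * one_way_feet                   # current goes out and back
    r = OHMS_PER_1000FT[awg] * round_trip_ft / 1000.0  # ohms of wire in the loop
    drop = amps * r                                    # V = I * R
    return drop, 100.0 * drop / volts

for awg in (12, 10, 8):
    drop, pct = voltage_drop(awg, 40, 84)
    print(f"{awg} AWG, 40 A, 84 ft one way: {drop:.1f} V drop ({pct:.1f}%)")
```

By those numbers, 8 AWG stays under a 3% drop at 84 feet while 12 AWG doesn't, which might be where the distance limit comes from.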
Don't get me wrong, I'm not suggesting that anyone do anything unsafe or against code.
But the question I've never gotten a straight answer to is why wires are rated in amperage and not watts. Maybe there's an electrical engineer on here who could help us out.
It's my understanding that watts are what do the work and create the heat. Wires are rated on how much current they can carry without getting too hot, heat being the limiting factor: they have to stay under a certain temperature to keep the insulation intact. And since watts are what create the heat, more watts = more heat.
The widely used amperage ratings of common wire sizes seem to assume a set voltage. Lower the voltage and the wire should be able to carry more amps, right? Take automotive 12 V systems as an example.
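One explanation I've run across, which may be exactly where my reasoning goes off the rails, is that the watts that matter for the wire are the ones burned off in the wire itself, I squared times R, and that number doesn't care what the circuit voltage is. Rough sketch with a made-up but typical length of 12 AWG:

```python
# The heat generated IN the wire is I^2 * R of the wire itself, which
# doesn't depend on the circuit voltage at all. Sketch using 50 ft of
# 12 AWG copper (~1.588 ohm per 1000 ft, room temperature).
r_wire = 1.588 / 1000 * 50             # ohms for one 50 ft conductor (assumed length)

for volts in (12, 120, 240):
    amps = 20.0                        # same current in every case
    heat_in_wire = amps ** 2 * r_wire  # watts cooking the copper
    watts_to_load = volts * amps       # watts doing the actual work
    print(f"{volts:>3} V: {watts_to_load:6.0f} W to the load, "
          f"{heat_in_wire:.1f} W heating the wire")
```

If that's right, it would answer both questions: the copper heats up the same at 20 A whether it's a 12 V or a 240 V circuit, so dropping the voltage doesn't buy the wire any extra amps, it just means each amp delivers fewer watts to the load. But I'd love someone who actually knows to confirm or shoot that down.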
Doing some searching, I found an interesting article that sums up the mind-boggling questions I have:
Wire Capacity Chart
And the fact that having two hot wires on 220 V to "share the load" isn't considered an acceptable argument puzzles me too.
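The counter-argument I've half heard (maybe this is what the code is getting at?) is that on a straight 220/240 V circuit the load sits in series between the two hots, so the full circuit current flows through both of them instead of splitting. Rough numbers:

```python
# On straight 240 V the load is in series between the two hots, so each
# hot carries the FULL circuit current -- it doesn't split in half. The
# win is that the same wattage needs half the amps compared to 120 V.
watts = 4800.0   # hypothetical round-number load

for volts in (120, 240):
    amps = watts / volts  # I = P / V
    print(f"{watts:.0f} W at {volts} V -> {amps:.0f} A in each current-carrying wire")
```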
Again, I'm not trying to be argumentative or unsafe; I just don't understand the logic of the code. If someone could shed a little light on it for me, I'm all ears.
BTW, my welder's power cord isn't tucked behind any walls, and I don't feel the least bit unsafe running the welder at max with its 12-gauge power cord plugged into a 40 amp breaker. Call me crazy.
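For what it's worth, here's the worst-case heat my cord would be making, assuming the welder really pulled the full 40 A continuously, which it almost certainly doesn't:

```python
# Heat generated per foot of 12 AWG copper (~1.588 ohm/1000 ft) at a few
# currents -- note the heat goes with the SQUARE of the amps.
r_per_ft = 1.588 / 1000  # ohms per foot, room temperature

for amps in (20, 30, 40):
    watts_per_ft = amps ** 2 * r_per_ft
    print(f"{amps} A -> {watts_per_ft:.2f} W of heat per foot of conductor")
```

The part that surprised me is the square law: 40 A makes four times the heat of 20 A in the same cord.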
