But the question I have never gotten a straight answer to is why wires are rated in amperage and not watts. Maybe there's an electrical engineer on here who could help us out.
++++++++++++++++++++++++++
Because the watts delivered to the load depend on more than just the ampacity of the wire. The wire has a few ratings of interest to this discussion:
1. The highest voltage at which the insulation remains safe at the wire's highest rated temperature.
2. The highest current (ampacity) the wire can carry without overheating. Staying within the voltage limit in #1, amps are what count in sizing wire; the watts delivered matter only derivatively.
First, the equations needed to make sense of simple electrical stuff:
I = E / R, where:
I = current in amps,
E = electromotive force in volts, and
R = resistance in ohms
P = I * E: power in watts equals the current in amps times the voltage in volts.
Example: If you were to use 600 volts to power a circuit wired with wire safely able to carry up to 40 amps, you could deliver up to 24,000 watts.
Using the same wire at 12 volts, you could safely deliver up to 480 watts.
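A quick sketch of that arithmetic in Python; the 40 amp ampacity and the two voltages are just the numbers from the example above:

def max_watts(ampacity_amps, voltage_volts):
    # P = I * E: the most power a wire can deliver while carrying
    # its full rated current at a given voltage.
    return ampacity_amps * voltage_volts

print(max_watts(40, 600))  # 24000 watts
print(max_watts(40, 12))   # 480 watts -- same wire, same 40 amps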
The folks making the wire and the folks selling it have no idea what voltage you will use, so it makes sense that they tell you the amp capacity of the wire.
+++++++++++++++++++++++++++++++++++++
It is my understanding of electricity that watts are what do the work and create the heat. Wires are rated on how much current they can carry without getting too hot; heat is the limiting factor. They have to stay under a certain temperature to keep the insulation intact. Watts are what create the heat, and more watts means more heat. The key point is that the watts heating the wire are the watts dissipated in the wire itself, P = I * I * R (combining the two equations above), and since the wire's resistance R is fixed, that heat depends only on the current, not on the supply voltage.
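A minimal sketch of that point in Python, assuming the standard table value of roughly 1.588 ohms per 1000 feet for 12 AWG copper (treat it as approximate):

OHMS_PER_FOOT_12AWG = 1.588 / 1000  # ~1.588 ohms per 1000 ft of copper

def wire_heat_watts(current_amps, length_feet):
    # P = I^2 * R: the power dissipated as heat in the wire itself.
    # Notice that the supply voltage never appears here.
    resistance = OHMS_PER_FOOT_12AWG * length_feet
    return current_amps ** 2 * resistance

# 20 amps through 50 feet of 12 AWG conductor makes the same heat
# whether the circuit runs at 12 volts or 600 volts:
print(wire_heat_watts(20, 50))  # about 31.8 watts of heat in the wire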
+++++++++
That is totally correct
++++++++++++++
The widely used amperage ratings of common wire sizes kinda assume a set voltage.
+++++++++++++
No, not at all.
+++++++++++
Lower the voltage and it should be able to carry more amps, right? Take automotive 12 V systems as an example.
+++++++++++++++
Absobloominglutely NOT RIGHT!! The heat in the wire is I * I * R, so the safe current for a given wire is the same at 12 volts as at 600 volts.
+++++++++++++++
And the fact that having two hot wires on 220 V to "share the load" is not an acceptable argument puzzles me too.
+++++++++++++++++++++++++
In a 220/240 VAC circuit, all the current flows through both wires in series. The chief difference is that a 240 V circuit with a given size wire will deliver twice the watts of the same wire running 120 VAC.
The two wires do NOT share the load such that each handles half the current; they both carry all of it. In a 120 VAC circuit all the current passes through the hot wire AND the neutral in series. In a 240 VAC circuit the same thing happens with the two hot wires. The conductors are in series, NOT parallel!
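A small Python sketch of that, assuming a purely resistive load; the 4800 watt figure is just an illustration:

def conductor_current(load_watts, voltage_volts):
    # I = P / E: the current that EVERY series conductor must carry.
    return load_watts / voltage_volts

# A 4800 watt load:
print(conductor_current(4800, 240))  # 20.0 amps in each hot wire
print(conductor_current(4800, 120))  # 40.0 amps -- the same load at
                                     # 120 V needs twice the ampacity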
+++++++++++++++++++++++++++++
Again, I'm not trying to be argumentative or unsafe; I just don't understand the logic of the code. If someone could shed a little light on it for me, please, I'm all ears.
BTW, my power cord for the welder is not tucked behind any walls, and I don't feel the least bit unsafe running my welder to max with its 12 ga power cord plugged into a 40 amp breaker. Call me crazy.