The thinking here is that, because a welder is seldom (if ever) designed for 100% duty cycle at its highest rated amperage, you don't need to run cable heavy enough to handle 100% duty cycle at that amperage. (A welder's duty cycle is the percentage of a 10-minute period it can run at a given output without overheating.) For example, if you were going to run a 50-amp electric motor on a given circuit, you would need #6 wire because the motor would be expected to run continuously. But a welder that draws 50 amps (say, with an output range of 40-140 amps) may have only a 20% duty cycle at any output over, say, 100 amps. So there's no expectation that it would ever actually draw 50 amps for any significant length of time, which means the wire has time to cool down.
The relevant table is NEC 630-11(a) (Table 630.11(A) in newer editions of the Code). It gives a derating multiplier for each duty-cycle percentage. For example (I am just making up these numbers), at a 20% duty cycle you might be allowed to derate to 60% of the maximum amperage. So if your welder drew 50 amps max, which would normally require #6 wire, you would multiply that by 0.6 to get 30 amps, and install wire sized for 30 amps (#10, say). You would still use a 50-amp breaker, however.
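To make the arithmetic concrete, here is a small Python sketch of the sizing procedure. The multipliers in it are placeholders, made up the same way as the numbers above; you'd substitute the real values from 630-11(a):

```python
# Illustrative sketch of the 630-11(a) conductor-sizing calculation.
# The duty-cycle multipliers below are MADE UP for illustration --
# substitute the actual values from the NEC table.
DUTY_CYCLE_MULTIPLIER = {  # duty cycle % -> ampacity multiplier
    100: 1.00,
    60: 0.80,
    20: 0.60,
}

def min_conductor_ampacity(rated_amps: float, duty_cycle_pct: int) -> float:
    """Minimum conductor ampacity for a dedicated welder circuit."""
    return rated_amps * DUTY_CYCLE_MULTIPLIER[duty_cycle_pct]

# Example from above: a welder drawing 50 A max at a 20% duty cycle
# needs conductors sized for 50 * 0.6 = 30 A (still on a 50 A breaker).
print(min_conductor_ampacity(50, 20))  # -> 30.0
```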
Unfortunately, I cannot for the life of me find a copy of 630-11(a) online, so I can't actually give you the multipliers for the duty-cycle percentages.
Again: this derating applies only to a dedicated circuit that will only ever serve the welder.