First, check your supply voltage at various times of the day and see what you REALLY have at the plug. You may think you have 220V, but odds are it's something else.
Next, check with the welder manufacturer to see what the voltage tolerance is for "normal" operation. Typically it's a 5% drop from nominal. If your supply voltage is higher than nominal, you have extra headroom: for example, if the welder calls for a nominal 220V, you can drop to 209V at the welder and still be within spec, and if the plug actually delivers 235V, you have even more room to work with.
Next, go to one of the online voltage drop calculators and, using the maximum current draw of your welder (from the plug, NOT the welder's output amperage), see what voltage drop you get for a given supply voltage, load, wire size, and length of wire. With that information, you'll know just how long an extension cord you can get away with. Make sure the house/shop wiring to your outlet is code-sized and the supply run isn't overly long, or you'll have to account for that drop as well.
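If you'd rather sanity-check the calculator, the underlying math is just Ohm's law over the round-trip length of the cord. Here's a minimal sketch, assuming copper conductors with approximate NEC Chapter 9 Table 8 resistances; the function names and the 5% default tolerance are my own for illustration:

```python
# Rough single-phase voltage-drop estimate for an extension cord.
# Assumes copper conductors; resistances are approximate NEC Ch. 9
# Table 8 values (ohms per 1000 ft). Ignores temperature rise and
# power factor, which is fine for a ballpark check.
OHMS_PER_1000FT = {12: 1.98, 10: 1.24, 8: 0.778, 6: 0.491}

def voltage_at_welder(supply_v, amps, cord_ft, awg):
    """Supply voltage minus the drop across the cord.

    The factor of 2 covers the round trip: current flows out on one
    conductor and back on the other, so both lengths drop voltage.
    """
    r_per_ft = OHMS_PER_1000FT[awg] / 1000.0
    return supply_v - 2 * cord_ft * amps * r_per_ft

def within_spec(supply_v, amps, cord_ft, awg, nominal_v=220, tol=0.05):
    """True if the welder still sees at least nominal minus tolerance."""
    return voltage_at_welder(supply_v, amps, cord_ft, awg) >= nominal_v * (1 - tol)
```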
I have a 210A MIG welder that draws around 25A from the line when it's running wide open. I can run a 100-ft extension cord with #10 wire and be well within spec for the welder, even at "full throttle."
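Running my numbers through the sketch above (same assumed table values):

```python
v = voltage_at_welder(supply_v=220, amps=25, cord_ft=100, awg=10)
print(f"{v:.1f}V at the welder")      # ~213.8V, above the 209V floor
print(within_spec(220, 25, 100, 10))  # True
```

Swap in awg=12 and the same 100-ft cord leaves only about 210V, just barely in spec, which is why #12 only works for shorter runs.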
The cheapest way to build a welder extension cord is to buy a #10-wire 120V cord, cut off the plugs, and fit the ends with the appropriate NEMA connectors. As the numbers above show, for shorter cords you can get away with #12 wire and still have an acceptable voltage drop. If you really want to be careful, put a voltmeter on the connection at the welder and watch how much the voltage sags under load. And if you never run at maximum amperage, you have even more margin to work with.