Interesting question. It'd take on the order of 13,000 miles of wire to produce even a 0.1 second delay, so with a hardwired system you'd never hear anything like the delays you get from some wireless solutions. A small phase distortion at higher frequencies takes much less delay, though, so let's run the quick numbers on that:
If we assume you might hear phase distortion when one speaker is 10% of a cycle (i.e. 36°) out of phase with another, and we assume primary (non-harmonic) tonal content is mostly below 1 kHz (e.g. the violin's open E string is about 660 Hz), then a 10% phase shift at 1 kHz requires a time difference between speakers of 1 ms / 10 = 0.1 ms, and even more at lower frequencies.
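A minimal sketch of that arithmetic (Python, using the 10%-of-a-cycle audibility threshold assumed above):

```python
def delay_for_phase_shift(freq_hz, fraction_of_cycle=0.10):
    """Time difference (seconds) that produces the given phase shift at freq_hz."""
    period_s = 1.0 / freq_hz
    return fraction_of_cycle * period_s

print(delay_for_phase_shift(1000))  # 0.0001 s = 0.1 ms at 1 kHz
print(delay_for_phase_shift(660))   # ~0.15 ms for the violin's open E string
```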
Light travels at 983,571,056 ft per second, and maybe up to √2 slower than that over PVC-insulated matched-impedance cabling. Let's say 700E6 ft/s.
So, a 0.1 ms delay would require 70,000 feet of wire-length difference between two speakers to even hear it at 1 kHz. Lower frequencies need longer runs: A above middle C at 440 Hz works out to about 159,000 ft of cable-length difference. Harmonics might be more easily shifted, say around 7,000 feet of wire for a 10 kHz harmonic, but my system is less than 5% of that length... and how many people can really identify that kind of distortion by ear?
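Same numbers, turning the delay into a cable-length difference, assuming the ~700E6 ft/s propagation speed guessed above:

```python
SIGNAL_SPEED_FT_PER_S = 700e6  # rough in-cable speed, ~c / sqrt(2)

def cable_length_for_delay(delay_s):
    """Wire-length difference (feet) needed to produce the given delay."""
    return delay_s * SIGNAL_SPEED_FT_PER_S

for freq_hz in (1000, 440, 10_000):        # 10 kHz standing in for a harmonic
    delay_s = 0.10 / freq_hz               # 10% of one cycle at this frequency
    print(f"{freq_hz} Hz: {cable_length_for_delay(delay_s):,.0f} ft")
# 1000 Hz: 70,000 ft; 440 Hz: ~159,000 ft; 10,000 Hz: 7,000 ft
```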
My speakers are all on cable runs of maybe 60 to 200 feet, so no problems with delay. Our bigger issue on long runs is actually attenuation: speakers on the same volume control end up projecting different volumes, because a speaker is a relatively low-impedance load that draws a lot of current at a small voltage, so the wire's own resistance eats a noticeable share of the signal.
That big fat Monster Cable that guys use in 10-foot lengths for their stereo speakers is a pointless marketing gimmick at that length, but heavy oxygen-free copper gauges become genuinely useful when you're trying to make speakers sitting out at 200 feet project with similar volume to those at just 40 or 60 feet. If you've ever wondered why they make high-impedance speakers (16 ohms and up), this is probably one of the reasons.
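A rough sketch of that attenuation math, assuming ballpark copper resistances per 1000 ft (my own figures, not measured) and treating the speaker as a purely resistive load:

```python
import math

# Approximate copper resistance, ohms per 1000 ft of a single conductor
OHMS_PER_1000FT = {16: 4.0, 12: 1.6, 10: 1.0}

def loss_db(run_ft, awg, speaker_ohms):
    """Level drop at the speaker relative to a zero-length run, in dB."""
    r_wire = OHMS_PER_1000FT[awg] * (2 * run_ft) / 1000   # out and back
    return 20 * math.log10(speaker_ohms / (speaker_ohms + r_wire))

for run_ft in (60, 200):
    for z in (8, 16):
        losses = {awg: round(loss_db(run_ft, awg, z), 2) for awg in (16, 12, 10)}
        print(f"{run_ft} ft into {z} ohms: {losses}")
```

With those assumed numbers, a 200-foot run of 16 AWG into 8 ohms sits roughly 1 dB below a 60-foot run on the same volume control; stepping up to 10 AWG or to a 16-ohm speaker shrinks that gap to a fraction of a dB.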