Coupling capacitors in series between stages of an audio circuit generally have a large enough value that the roll-off begins below 20 Hz. Since little audio voltage is dropped across a coupling capacitor at the higher audible frequencies, in theory its distortion should not be a factor. This is exactly what I set out to prove or disprove with my tests.
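For reference, the coupling cap and the next stage's input impedance form a first-order high-pass, with the -3 dB corner at f_c = 1/(2πRC). A minimal sketch of the arithmetic in Python, assuming hypothetical interstage values (a 100 nF cap driving a 100 kΩ input impedance):

```python
import math

def highpass_cutoff(c_farads: float, r_ohms: float) -> float:
    """-3 dB corner of a first-order RC high-pass: f_c = 1 / (2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

# Hypothetical interstage values: a 100 nF coupling cap driving a stage
# with a 100 kohm input impedance.
print(f"{highpass_cutoff(100e-9, 100e3):.1f} Hz")  # ~15.9 Hz, i.e. the roll-off starts below 20 Hz
```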
A coupling cap will create low- and mid-frequency distortion, and the distortion increases as the frequency drops; I have seen a graph of this in an audio engineering book. The ear perceives it as a fat, bass-heavy sound, which some listeners like. Amps with single-ended supplies need coupling caps to keep DC off the output.
The function of an output coupling capacitor is to keep the DC voltage from reaching the speaker. This is very common in solid-state amplifiers that use a single supply rail for the output stage. The capacitance must be high enough to couple all the audio frequencies to the loudspeaker; a smaller value would roll off the low frequencies.
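Working the same corner-frequency formula backwards gives the required output capacitance, C = 1/(2π·f_c·R). A quick sketch, assuming an 8 Ω speaker and a 20 Hz corner as illustrative targets:

```python
import math

def coupling_cap_for_corner(f_hz: float, r_ohms: float) -> float:
    """Capacitance that places the -3 dB corner at f_hz into a load of r_ohms."""
    return 1.0 / (2.0 * math.pi * f_hz * r_ohms)

# Assumed targets: 8-ohm speaker, 20 Hz corner.
c = coupling_cap_for_corner(20.0, 8.0)
print(f"{c * 1e6:.0f} uF")  # ~995 uF -> ~1000 uF in practice
```

The low load impedance of a speaker is why these output caps end up in the thousands of microfarads, where electrolytics are about the only practical choice.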
The main contributor to capacitor distortion is its voltage coefficient. This is a measure of how much the capacitance changes as the applied voltage varies. Ideally, a capacitor should have the same capacitance no matter how much voltage is present across its terminals.
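To make the idea concrete, a voltage coefficient is often approximated as a linear dependence of capacitance on the applied voltage. The sketch below uses a made-up coefficient k purely for illustration; real parts (class 2 ceramics especially) can be far worse and strongly nonlinear:

```python
# Illustration only: a hypothetical linear voltage coefficient k applied
# to a nominal capacitance C0, i.e. C(V) = C0 * (1 + k * V).
C0 = 1e-6   # nominal 1 uF
k = -0.02   # made-up figure: -2% capacitance per volt

def capacitance(v_bias: float) -> float:
    return C0 * (1.0 + k * v_bias)

for v in (0.0, 5.0, 10.0):
    print(f"{v:4.1f} V -> {capacitance(v) * 1e6:.2f} uF")
```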
It seems the ceramic caps could have been picking up noise, or generating a signal themselves through vibration, since class 2 ceramics are somewhat piezoelectric. Worth experimenting with in the future. Has anybody ever tried to use a very large ceramic capacitor as a pickup, like a piezo mic?
One problem with capacitors is that they can distort the audio passing through them, but this happens only when a significant signal voltage is developed across the capacitor. At very high frequencies virtually all of the audio gets through, and at very low frequencies almost none does; distortion occurs only in the region where some of the audio passes and some is blocked.
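To see this numerically, here is a sketch that integrates a first-order RC high-pass whose capacitance is given a hypothetical linear voltage dependence (the coefficient k is invented for illustration) and estimates THD at a few frequencies. The distortion shows up near the corner, where signal voltage appears across the cap, and vanishes well above it:

```python
import numpy as np

R, C0, k = 10e3, 1e-6, 0.05     # 10k load, 1 uF nominal, made-up 5%/V coefficient
fc = 1 / (2 * np.pi * R * C0)   # ~15.9 Hz corner

def thd_of_highpass(f_hz, amp=1.0, cycles=50, pts_per_cycle=2000):
    """Integrate dVc/dt = (Vin - Vc) / (R * C(Vc)) and measure output THD."""
    dt = 1.0 / (f_hz * pts_per_cycle)
    n = cycles * pts_per_cycle
    t = np.arange(n) * dt
    vin = amp * np.sin(2 * np.pi * f_hz * t)
    vc = 0.0
    vout = np.empty(n)
    for i in range(n):
        vout[i] = vin[i] - vc            # output is taken across R
        c = C0 * (1.0 + k * vc)          # voltage-dependent capacitance
        vc += vout[i] / (R * c) * dt     # forward-Euler step
    tail = vout[-10 * pts_per_cycle:]    # keep last 10 cycles, past the transient
    spec = np.abs(np.fft.rfft(tail))
    fund = 10                            # fundamental bin: 10 cycles in the tail
    harmonics = spec[[2 * fund, 3 * fund, 4 * fund]]
    return np.sqrt(np.sum(harmonics**2)) / spec[fund]

for f in (fc, 10 * fc, 100 * fc):
    print(f"{f:8.1f} Hz  THD ~ {100 * thd_of_highpass(f):.4f} %")
```

With these made-up numbers the THD is largest right at the corner and falls toward zero two decades above it, which matches both the "distortion increases as the frequency drops" observation and the intuition that a cap dropping no signal voltage contributes no distortion.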