What looks like "perfect" filter (if we talk about steepness) from frequency domain point of view is absolutely opposite from time domain point of view. Because these are related through inverse
1/x relationship.
Likewise, in a mathematically related aspect, when you compute a spectrogram with an FFT (like in the HQPlayer metering view) you need to choose a transform length. When you make the transform longer, you gain frequency resolution (in digital filters this means steeper roll-off), but your time resolution for detecting changes suffers. You need to choose a suitable balance between the two.
This time-frequency uncertainty is called the Fourier uncertainty principle. Also remember that hearing can beat this uncertainty principle. See also here.
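To make that trade-off concrete, here is a minimal Python/scipy sketch. The chirp test signal and the two FFT lengths are arbitrary illustrations of the principle, not anything HQPlayer actually uses:

```python
# Same sweep analysed with a short and a long FFT: frequency resolution
# and time resolution trade off against each other through the FFT length.
import numpy as np
from scipy import signal

fs = 48000                                       # sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)
x = signal.chirp(t, f0=100, t1=1.0, f1=20000)    # sweep from 100 Hz to 20 kHz

for nfft in (256, 8192):
    f, tt, Sxx = signal.spectrogram(x, fs=fs, nperseg=nfft)
    print(f"FFT length {nfft:4d}: "
          f"frequency bins ~{fs / nfft:6.1f} Hz wide, "
          f"time slices ~{1000 * nfft / fs:6.2f} ms long")
```

The long FFT pins down the sweep's frequency precisely at each instant but smears exactly when it was there, and vice versa.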
Yeah, I think I have a bit of a grasp of what the uncertainty principle tells us. It also states that a band-limited signal must necessarily be infinite in length; since our recordings are finite, they cannot be perfectly band-limited (which is why we need short filters in the first place instead of super-long sinc filters).
The inverse 1/x relationship would make it seem like the perfect brickwall filter would be infinitely bad in the time domain, since it rings infinitely long, while being perfect in the frequency domain.
However, since the perfect brickwall filter will perfectly reconstruct a sampled bandlimited signal, this infinite badness in the time domain is apparently of no consequence to bandlimited signals. (This is actually the same thing as saying a reconstruction filter will not ring when fed a bandlimited pulse, now that I think of it.)
Which is I guess counterintuitive.
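Here is a quick numpy sketch of that counterintuitive point, with a toy sample rate and made-up test tones; the finite summation window stands in for the ideal infinite sinc sum:

```python
# Whittaker-Shannon (ideal sinc) reconstruction of a bandlimited signal:
# the infinitely ringing kernel reproduces the signal almost exactly.
import numpy as np

fs = 100.0                           # toy sample rate
n = np.arange(-500, 501)             # finite window of the ideally infinite sum
t = np.linspace(-0.5, 0.5, 1001)     # dense grid to reconstruct onto

def x(t):
    # bandlimited test signal: two tones well below the 50 "Hz" Nyquist limit
    return np.sin(2 * np.pi * 7 * t) + 0.5 * np.sin(2 * np.pi * 31 * t)

samples = x(n / fs)
# x(t) = sum_n x(n/fs) * sinc(fs*t - n), evaluated for all t at once
recon = np.sinc(fs * t[:, None] - n[None, :]) @ samples

print(f"max reconstruction error: {np.max(np.abs(recon - x(t))):.1e}")
# small, and it shrinks toward zero as the window grows: the sinc's
# "infinite ringing" cancels instead of showing up in the output
```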
Also, since Currawong came into this thread asking about filter ringing (and stating that under normal circumstances a DAC filter doesn't ring), and kennyb123 was talking about PGGB (which on this myths page https://www.remastero.com/faq-plus.html#Myths takes aim at the HQPlayer philosophy that digital filtering is a compromise between the time and frequency domains), I have a need to figure out who is "right".
I did a little thought experiment where we have to assume two different types of recordings (see the sketch after this list):
- A recording of an instrument with no frequency content above 20 kHz. There is no need to bandlimit this signal when putting it through an ADC, since it is inherently bandlimited, and no ringing will get baked into the recording by the ADC filter.
- A recording of an instrument that had frequency content above 20 kHz. This recording will contain all of the instrument's frequencies below 20 kHz, plus ringing at the ADC filter frequency where the higher frequencies had to be filtered out.
- Note that both these recordings will still not be bandlimited (remember, bandlimited signals don't exist in the real world), so both of them will have content above 20 kHz. What this looks like or where it comes from I don't know; I still have trouble wrapping my head around this fact. See this Wikipedia excerpt:
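A rough numerical sketch of the ADC side of this thought experiment follows. The FIR below is a made-up stand-in for an ADC decimation filter, and the two clicks are toy versions of case one and case two:

```python
# A click already bandlimited below the cutoff passes essentially unchanged;
# a wideband click gets the filter's ringing baked into the "recording".
import numpy as np
from scipy import signal

fs = 96000
taps = signal.firwin(4001, cutoff=20000, fs=fs)  # steep linear-phase "ADC" lowpass
mid = len(taps) // 2                             # linear-phase group delay

t = (np.arange(8192) - 4096) / fs
bl_click = np.sinc(2 * 18000 * t)                # case 1: click bandlimited to 18 kHz
wb_click = np.zeros(8192)
wb_click[4096] = 1.0                             # case 2: wideband click (flat to Nyquist)

for name, x in (("bandlimited click", bl_click), ("wideband click", wb_click)):
    y = signal.lfilter(taps, 1.0, np.pad(x, (0, mid)))[mid:]  # delay-compensated
    resid = y - x                                # whatever the filter added or changed
    print(f"{name:17s}: added-ringing RMS = {np.sqrt(np.mean(resid ** 2)):.1e}")
```

The bandlimited click comes out far cleaner, which is the whole point of the two cases.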
Now what happens if we try to reconstruct both these recordings with a perfect brickwall filter?
- In case one we get the exact original signal without ringing, plus ringing artifacts for all the out-of-band parts of the recording.
- In case two we get the exact original signal after bandlimiting, plus all the ringing from the ADC filter, plus ringing artifacts for all the out-of-band parts of the recording.
A short apodizing filter would, in case two, replace the ADC filter's ringing with its own (presumably better-behaved) ringing, and would also ring less in response to the out-of-band parts.
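To put rough numbers on "presumably better", here is a toy comparison of a long, steep linear-phase lowpass against a short apodized one. The tap counts, cutoffs, and the -80 dB threshold are arbitrary choices, not any shipping DAC or PGGB filter:

```python
# How long does each filter's impulse response ring above -80 dB of its peak?
import numpy as np
from scipy import signal

fs = 44100
long_f = signal.firwin(8191, cutoff=21500, fs=fs)                    # steep, brickwall-ish
short_f = signal.firwin(63, cutoff=19000, fs=fs, window="blackman")  # short, apodized

for name, h in (("long/steep", long_f), ("short/apodized", short_f)):
    idx = np.where(np.abs(h) > np.max(np.abs(h)) * 1e-4)[0]  # taps within 80 dB of peak
    span_ms = (idx[-1] - idx[0]) / fs * 1000
    print(f"{name:15s}: {len(h):4d} taps, ringing above -80 dB spans ~{span_ms:.1f} ms")
```

The steep filter rings for tens of milliseconds; the apodized one is done in roughly a millisecond, at the cost of a much earlier, gentler roll-off.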
From this little thought experiment I draw three conclusions:
1. The HQPlayer view and the Chord/PGGB view of signal reconstruction do not conflict when it comes to the sampling theorem.
2. The HQPlayer view is that ADC ringing + out-of-band signal ringing is quite detrimental to the quality of the reconstructed signal.
3. The Chord/PGGB view is that ADC ringing + out-of-band signal ringing is completely inconsequential for the reconstructed signal, as humans cannot hear the ringing and there is barely any out-of-band signal energy in recordings anyway.
Now this became quite a long post, but it pains me that our audiophile community seems yet again to have devolved into two camps, and I naively would like to figure out how we can prove which view is right, especially since this seems to be one of the rare audiophile discussions that might actually be resolved by scientific means.
Does anyone have ideas on how we could visualise the time-domain badness of the ringing caused by out-of-band components of the signal?
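One possible approach, as a rough sketch: since the filter is linear, the part of its output caused by out-of-band content is exactly its response to the out-of-band part alone, so it can be isolated by subtraction and then plotted or measured. Everything below is a toy stand-in (a sine for the "music", highpassed noise for the ultrasonic content, windowed-sinc FIRs for the reconstruction filters):

```python
# Isolate the output component caused purely by out-of-band input,
# for a short and a long reconstruction filter.
import numpy as np
from scipy import signal

fs = 96000
t = np.arange(0, 0.1, 1 / fs)
rng = np.random.default_rng(0)
music = np.sin(2 * np.pi * 440 * t)              # stand-in for the in-band "music"
b, a = signal.butter(4, 22000, "high", fs=fs)
hf = signal.lfilter(b, a, 0.05 * rng.standard_normal(t.size))  # >22 kHz content
x = music + hf                                   # the "recording"

for n_taps in (127, 4001):                       # short vs long filter
    taps = signal.firwin(n_taps, cutoff=20000, fs=fs)
    mid = n_taps // 2
    lp = lambda s: signal.lfilter(taps, 1.0, np.pad(s, (0, mid)))[mid:]
    ringing = lp(x) - lp(music)                  # equals lp(hf) by linearity
    print(f"{n_taps:4d} taps: out-of-band residue RMS = "
          f"{np.sqrt(np.mean(ringing ** 2)):.1e}")
# plotting np.abs(signal.hilbert(ringing)) against t would show the
# residue's envelope in time, i.e. where and how long it "rings"
```

With real recordings you would substitute the actual material for the toy signals and the candidate filters for the firwin stand-ins.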
My ears tell me short filters sound quite different from long ones, so the badness must be significant, but unfortunately listening impressions are not the same as evidence.