Schiit Happened: The Story of the World's Most Improbable Start-Up
Feb 14, 2023 at 4:38 PM Post #110,581 of 153,859
....

Stated? Yes, I believe that Jason did say more than once that he heard a marked difference in sound quality with a full Unison source-to-DAC chain.
Someone out there with a better memory than mine can likely point you to the video in which he said it...

Proves? If only they could find an objective method to prove this... classic electrical measurements prove little or nothing; otherwise Folkvangr wouldn't exist. :) Seriously, if it sounds better to me... only my wallet suffers. I'm so cheap that I squeak; audio equipment is my only vice.

I'll let you guys know what I discover, as soon as I get my hands on Urd.

The Unison connection is a filtered/restored connection. A good one, I am sure, that de-jitters and cleans up a noisy digital stream. But nothing more. So:

Consider a bit-stream source with both a USB output and an S/PDIF output. Connect those ports to a Schiit DAC that has both types of inputs. It should be possible to compare, in the DAC, the USB bitstream (post-Unison receiver) to the S/PDIF bitstream (post-optical detection). The bitstreams should be identical.
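For anyone who wants to actually run that comparison, a minimal sketch of the diff step follows. The capture step itself would need a logic analyzer or a DAC with a debug tap; the file names and the `read_frames` helper are hypothetical placeholders, not a real tool:

```python
import wave

def compare_streams(a: bytes, b: bytes):
    """Return (identical, differing_byte_count) for two captured bitstreams."""
    diff = sum(x != y for x, y in zip(a, b)) + abs(len(a) - len(b))
    return diff == 0, diff

def read_frames(path):
    # Hypothetical helper: load raw PCM frames from a capture saved as WAV.
    with wave.open(path, "rb") as w:
        return w.readframes(w.getnframes())

# Usage (file names are placeholders for whatever your capture tool writes):
# identical, diff = compare_streams(read_frames("usb_unison.wav"),
#                                   read_frames("spdif.wav"))
```

If the two captures match byte-for-byte, the inputs are bit-identical after reception; any nonzero count localizes where they diverge.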
 
Feb 14, 2023 at 4:56 PM Post #110,582 of 153,859
The Unison connection is a filtered/restored connection. A good one, I am sure, that de-jitters and cleans up a noisy digital stream. But nothing more. So:

Consider a bit-stream source with both a USB output and an S/PDIF output. Connect those ports to a Schiit DAC that has both types of inputs. It should be possible to compare, in the DAC, the USB bitstream (post-Unison receiver) to the S/PDIF bitstream (post-optical detection). The bitstreams should be identical.
I'm sorry, but this makes little sense to me.

The whole reason for the Wyrd was to clean up USB, and the Eitr was to clean up optical.

To my ears, optical is the last resort; I always preferred coax when setting up equipment in our shops.
 
Feb 14, 2023 at 5:01 PM Post #110,583 of 153,859
I'm sorry, but this makes little sense to me.

The whole reason for the Wyrd was to clean up USB, and the Eitr was to clean up optical.

To my ears, optical is the last resort; I always preferred coax when setting up equipment in our shops.

What doesn't make sense? I simply suggested doing a bit stream comparison after reception on the DAC side. There should be an answer for that. Either there is no difference or one is cleaner or more faithful to the original than the other.
 
Feb 14, 2023 at 5:02 PM Post #110,584 of 153,859
I'm sorry, but this makes little sense to me.

The whole reason for the Wyrd was to clean up USB, and the Eitr was to clean up optical.

To my ears, optical is the last resort; I always preferred coax when setting up equipment in our shops.
Eitr was a USB-to-S/PDIF converter, IIRC.
 
Feb 14, 2023 at 5:05 PM Post #110,585 of 153,859
What doesn't make sense? I simply suggested doing a bit stream comparison after reception on the DAC side. There should be an answer for that. Either there is no difference or one is cleaner or more faithful to the original than the other.
I think you are not remembering the effects of jitter.
 
Feb 14, 2023 at 5:06 PM Post #110,586 of 153,859
Feb 14, 2023 at 5:11 PM Post #110,587 of 153,859
Sorry, and thank you for the clarification, but you get the idea - decrapify USB

I'm currently using the Schiit Wyrd between my NUC and Ares DAC. Wyrd makes a small but noticeable difference in my system.
The Wyrd (and probably the Ares) will go up for sale when my OG Yggy+ shows up, hopefully by the end of the month.
 
Feb 14, 2023 at 5:32 PM Post #110,588 of 153,859
I think you are not remembering the effects of jitter.
I said that... that Unison is a de-jittering restoration filter. My question is... how does a Unison-treated bitstream compare to an S/PDIF-detected bitstream? Simple question.

I am not saying there is no value in Unison. Of course there is, and if one uses a USB port out of a computer to run a DAC, Unison is great.

My question is academic. I was interested in learning how, post-detection in a DAC, a Unison-filtered USB stream compares to an S/PDIF stream of the same track.
 
Last edited:
Feb 14, 2023 at 5:43 PM Post #110,589 of 153,859
So "the 1700E can manage pristine, faithful playback with optimal accuracy, even restoring information that was lost during the original digital recording." Really???? Any ideas on what this means? Up-sampling? Just marketeering?
 
Feb 14, 2023 at 5:56 PM Post #110,590 of 153,859
Screenshot_20230213_215746_Drive.jpg
Pretty cryptic!

Speakers in a graveyard, looking battered and bruised.

Rock/Pop Concert underground... many people.... bright lights..... weird looking mech spider hanging from the stage.....

Is schiit dipping their toes into a new listening experience?

Personal ear monitors with a built-in laser light show display and a subpac for your bottom?

Bring the rave wherever you go?

Makes a guy wonder if they had planned to release a product before this issue released but got hung up with supply chain and production issues.
My thought: it's a variation on the photo of the reflecting pool: the speakers, reflected in the water, appear as the live performance. At least when powered by Schiit.

So "the 1700E can manage pristine, faithful playback with optimal accuracy, even restoring information that was lost during the original digital recording." Really???? Any ideas on what this means? Up-sampling? Just marketeering?
Absolutely not marketeering. Remember, Denon is a Japanese company and mysterious things happen in Japan.
 
Feb 14, 2023 at 6:27 PM Post #110,591 of 153,859
The Unison connection is a filtered/restored connection. A good one I am sure that de-jitters and cleans up a noisy digital stream. But nothing more. So:

Consider a bit stream source with a USB output and a SPDIF output. Connect those ports to a Schiit DAC with both types of ports. It should be possible to compare in the DAC, the USB Unison bitstream (post Unison receiver) to the SPDIF bitstream (post optical detection). The bitstreams should be identical.
They really aren't. Oh boy, it ain't even close.

For starters, S/PDIF is a (potentially lossy) real-time protocol; UAC ("USB Audio Class", the audio transport layer protocol for USB) isn't (necessarily).

In simple terms this means that S/PDIF streams out your bits at a constant rate. For that, your raw audio data gets chopped into subframes of 32 bits each: a 4-bit sync preamble, up to 24 bits of audio sample data, and four trailing flag bits (validity, user data, channel status, and parity). That's done on a per-channel basis, usually alternating between the two. Metadata like the source number, the channel number, clock accuracy, copy restrict/permit, and word length isn't carried in a per-packet header; it accumulates one channel-status bit per subframe across a block of 192 frames. Subframe follows subframe, immediately, ad infinitum.
The connection between the sender and the receiver is strictly unidirectional (meaning: one way) and happens in real-time, meaning that there is (usually) little to no caching involved on either end of the line, and if an error is encountered by the receiver (which can be tested for by using the parity bit embedded in each subframe), the receiver has no way to call back to the sender and ask for a do-over. It'll then either have to discard the sample, or play the audio with the potential error left unchanged.
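For illustration, here's a minimal sketch of that 32-bit subframe layout. It's a deliberate simplification: the real X/Y/Z preambles are biphase-mark sync patterns that can't be expressed as plain data bits, so a zero placeholder stands in for them here:

```python
def spdif_subframe(sample_24bit, validity=0, user=0, channel_status=0):
    """Pack one 32-bit S/PDIF subframe (simplified illustration).

    Bits 0-3:  sync preamble (zero placeholder; real X/Y/Z preambles are
               special biphase-mark patterns, not ordinary data bits).
    Bits 4-27: up to 24 bits of audio sample data.
    Bits 28-30: validity (V), user data (U), channel status (C) flags.
    Bit 31:    parity (P), chosen so bits 4-31 have even parity.
    """
    word = (sample_24bit & 0xFFFFFF) << 4
    word |= (validity & 1) << 28
    word |= (user & 1) << 29
    word |= (channel_status & 1) << 30
    parity = bin(word >> 4).count("1") & 1  # even parity over bits 4-30
    word |= parity << 31
    return word
```

The receiver can recompute that parity bit to detect a single-bit error, but, as described above, it has no back channel to request a retransmission.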

S/PDIF is also what's called a hybrid signal, meaning that the data it transmits is digital (the signal states are either interpreted as "high" or "low", but no shades of gray in between), but the timing of the signal is entirely analog.
Normally, the data rate of the bus is not related to the bits that are being sent. But with S/PDIF, the send rate IS the data rate. Meaning that the sender switches the signal to a high or low state as needed for the data that is to be transmitted, and the receiver samples that signal's state at a fixed interval. The clocking for this signal is determined by the sender. If the sender's clock is unable to sustain a highly regular clock interval, any fluctuation in the "rhythm" of the signal will manifest itself in the receiver as jitter. The bigger your clocking mismatch is between the sender and the receiver, the higher your chance for errors. Same goes for cable quality and length, shielding, etc.

In terms of protocol, there's no difference in fiber vs. metal cable. All that changes is the medium, everything else remains the same.

UAC, in contrast, is an entirely different animal and hugely depends on which protocol is used in particular (UAC 1, 2, or 3), and how the clocking mechanism is implemented in the receiving device.

Information can be sent via USB in one of four different modes: as a control transfer (where a device "asks" for data and gets responded to in due time, sort of how networks usually work), as an interrupt transfer (where the sender sends an interrupt signal that tells the receiver to start listening no matter what else it is currently doing), as a bulk transfer (where one device opens a connection and then streams a finite amount of data to the receiver, like print jobs), or as an isochronous transfer (often used for audio streams).

"Full-speed" USB devices communicate at a data rate of 12 Mbit/s. But they don't send the data constantly. Instead, it gets chopped into smaller chunks, called frames, that are sent out at 1 ms intervals — or 1,000 frames a second. That leaves room available for other USB devices to communicate with one another over the same bus.
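A quick back-of-the-envelope check of what those 1 ms frames must carry for CD-quality audio (my arithmetic, not from the post):

```python
# Average payload per 1 ms USB frame for CD-quality stereo PCM.
sample_rate = 44_100       # samples per second, per channel
channels = 2               # stereo
bytes_per_sample = 2       # 16-bit PCM
frames_per_second = 1_000  # one full-speed USB frame every millisecond

bytes_per_second = sample_rate * channels * bytes_per_sample
print(bytes_per_second)                      # 176400
print(bytes_per_second / frames_per_second)  # 176.4 bytes per frame on average
# Since a frame can only carry whole bytes, the host alternates between
# 176- and 177-byte payloads so the long-term average rate stays exact.
```

That fractional per-frame payload is exactly why the clock-synchronization schemes described below exist.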

For isochronous transfer, your computer (or other source device) first loads large chunks of audio data into memory, and then spools the data to the output port in a continuous stream of 1,000 frames a second. No metadata headers are sent beyond whatever is required for the handshake during establishment of the connection between the sender and the receiver. The rate at which the data is sent is determined by the oscillator responsible for the USB bus (i.e. the host device's USB controller), so that no mismatch between the sender and the receiver can exist. This rate is also entirely independent of anything else that is going on inside the sender (i.e. the PC) or the receiver (i.e. the DAC). This constant stream keeps going whether there is actual audio data available or not. In theory, this guarantees a constant flow of data. In practice, you can end up with "holes" in your transmission if the sending device's processor gets hogged by other processes and can't keep up supplying new frames to the USB cache in time to be streamed out.

For isochronous transfers, there are three methods of clock synchronization available: synchronous, adaptive, and asynchronous.

For synchronous clock synchronization between the host and the client, the bus signal itself is used: the receiving device takes the 1 kHz rate at which data frames arrive and derives its own clock from it. That's almost always very jittery and is thus no longer really used in audio. But it WAS the standard when UAC first arrived, which is why @Baldr insisted for so long (and rightfully so) that USB sounds like ass when compared to the somewhat more stable S/PDIF clocking mechanism.

In adaptive clock synchronization, the client provides its own clock. A control circuit has to be implemented that constantly measures the incoming data rate and determines from that an average to which it adjusts the client's clock. Since the clock is now independent of the bus signal, it is not as susceptible to jitter. But the two device clocks can still drift apart, which will lead to buffer under/overflows. To avoid that, the client keeps sampling the incoming data rate and will constantly attempt to adapt to that jitter or drift, but the result is that while you're less likely to drop any information in the frequency domain, your audio stream's accuracy will absolutely suffer in the time domain. This is, to my understanding, how USB-to-Unison connections work.
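A toy sketch of that adaptive averaging loop (purely illustrative; not any vendor's actual control circuit):

```python
# Adaptive clock recovery, reduced to its essence: the receiver keeps an
# exponential moving average of the observed incoming data rate and slews
# its own sample clock toward it. The clock follows the source's jitter
# and drift, which is exactly the time-domain compromise described above.
def adapt_clock(local_rate, observed_rates, alpha=0.05):
    """Slew local_rate toward a running average of the observed rates."""
    avg = local_rate
    for r in observed_rates:
        avg = (1 - alpha) * avg + alpha * r  # exponential moving average
        local_rate = avg                     # clock tracks the noisy average
    return local_rate
```

With a steady source the clock stays put; with a drifting source the clock follows, keeping the buffer level but smearing timing.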

For the third method, asynchronous synchronization, the receiver runs its own independent, fixed clock and pulls data out of the sender's buffer at a constant and controlled rate, managing the buffer fill (via a feedback mechanism back to the sender) to ensure that there's always enough data available to stream. This makes the entire thing independent from the bus clock and any jitter that might be inherent to that, and thus ensures the highest possible accuracy in the frequency as well as the time domain. This is, to my understanding, how Unison-to-Unison connections work.
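And a toy sketch of the asynchronous feedback idea (again purely illustrative, assuming a simple proportional correction; not a description of Schiit's actual firmware):

```python
# Asynchronous mode, reduced to its essence: the receiver consumes samples
# at its own fixed clock and periodically tells the sender how many samples
# to supply next, steering the buffer toward a target fill level.
def feedback_request(buffer_fill, target_fill, nominal_per_frame):
    """Samples to request for the next frame: the nominal rate plus a
    gentle correction proportional to the buffer's distance from target."""
    correction = (target_fill - buffer_fill) // 10  # proportional term
    return nominal_per_frame + correction
```

Because the receiver's clock never moves, timing accuracy is preserved; only the request size flexes to keep the buffer from running dry or overflowing.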
 
Last edited:
Feb 14, 2023 at 6:34 PM Post #110,592 of 153,859
Last edited:
Feb 14, 2023 at 6:38 PM Post #110,594 of 153,859
They really aren't. Oh boy, it ain't even close.

[... full S/PDIF vs. UAC explanation trimmed; quoted in full in post #110,591 above ...]
What he said 🤪

Also Mike likes the sound of Unison better than spdif if I'm not mistaken, so there's the subjective aspect to consider.
 
Feb 14, 2023 at 6:39 PM Post #110,595 of 153,859
They really aren't. Oh boy, it ain't even close.

[... full S/PDIF vs. UAC explanation trimmed; quoted in full in post #110,591 above ...]
Excellent! You are confirming that Unison mitigates/corrects the many ills of USB.

Now even if SPDIF's clock is fixed... why wouldn't the receiver buffer the stream and re-clock it at low jitter?

Edit: coax question previously answered.
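To sketch why a plain buffer-and-re-clock only postpones the problem (illustrative numbers only): any residual mismatch between the source clock and the local low-jitter clock slowly fills or drains the FIFO, which is why the adaptive and asynchronous schemes described above adjust a clock or manage the buffer instead.

```python
# Toy FIFO model (illustrative, not any real DAC design): samples arrive at
# the source's clock rate and leave at the receiver's own low-jitter clock
# rate. Any rate mismatch changes the buffer fill level over time.
def simulate_fifo(source_rate, local_rate, seconds, start_fill=1000):
    """Return the FIFO fill level (in samples) after `seconds` of playback."""
    fill = start_fill
    fill += int(source_rate * seconds)  # written at the sender's clock
    fill -= int(local_rate * seconds)   # read out at the local clock
    return fill

# Even a tiny mismatch drains the buffer eventually: at 44,100 vs. 44,101 Hz
# (about 23 ppm), a 1,000-sample head start is gone in under 17 minutes.
```

A bigger buffer buys more time at the cost of latency, but without some form of rate adaptation an underrun or overrun is inevitable.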
 
Last edited:
