> > > According to the newest Electronic Musician magazine which has an
> > > article on latency (for computer recording), AD or DA takes 1.5 ms.
> > > . . .
>
> I'll have to read the article, but this does not seem correct.
>
> For AD at a sampling frequency of 44.1 kHz, I'd expect a latency of
> 1 / 44100, or approx 22.7 microseconds. I.e., you'll get a new sample
> point every 22.7 microseconds. Now you could buffer up a bunch of
> sample points and output them later (producing latency), but they are
> still being produced at a rate of 44,100 per second.
>
> Likewise for DA at 44.1 kHz.
>
> My guess is that the EM article is not discussing a pure AD-then-DA
> process.
>
> Dennis Leas
> -------------------
> dennis@mdbs.com

A/D conversion does not happen instantaneously. In a successive
approximation A/D (a common type), the number of bits of resolution
determines the conversion time. For example, on a microprocessor that I
use on my robots (the PIC16C7X), the A/D conversion time per bit, TAD,
is 1.6 us, and it takes 9.5 TAD to do an 8-bit conversion (in general
it's 1 TAD per bit plus some constant). Then you have to figure in the
sampling time (the time during which an internal capacitor is charged
to the voltage being sampled), which happens before the conversion even
starts; this is 12 us on the PIC. So on the PIC the total delay would
be ~27 us. And that is only for 8 bits: the conversion time would be
about 3x longer for 24 bits, and the sampling time is probably much
longer as well, since a higher resolution is required.

D/A as it is commonly implemented (1-bit delta-sigma) also has inherent
delay...

Cheers,
Keith
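
[To make the arithmetic in this thread concrete, here is a minimal C
sketch, an editor's illustration rather than either poster's code,
that works out both figures: the 44.1 kHz sample period Dennis cites
and the PIC16C7X successive-approximation delay Keith describes. The
constants are the values quoted above, taken as assumptions, not
datasheet-checked numbers.]

    #include <stdio.h>

    int main(void)
    {
        /* Sample period at 44.1 kHz: one new sample every 1/44100 s,
           regardless of how long any single conversion takes. */
        double sample_period_us = 1.0e6 / 44100.0;      /* ~22.7 us */

        /* Successive-approximation A/D timing, using the PIC16C7X
           figures quoted above (assumed): TAD = 1.6 us per bit,
           9.5 TAD per 8-bit conversion, 12 us acquisition time. */
        double tad_us         = 1.6;
        double conversion_us  = 9.5 * tad_us;           /* ~15.2 us */
        double acquisition_us = 12.0;       /* capacitor charge time */
        double total_us       = acquisition_us + conversion_us; /* ~27 us */

        printf("sample period @ 44.1 kHz: %5.1f us\n", sample_period_us);
        printf("8-bit SAR conversion:     %5.1f us\n", conversion_us);
        printf("total A/D delay:          %5.1f us\n", total_us);
        return 0;
    }

[The two numbers come out close, ~22.7 us vs. ~27.2 us, which is the
heart of the disagreement: the sample rate fixes how often samples
appear, while the converter's acquisition-plus-conversion time is a
separate, additive delay.]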