most
softsynths are zero latency. only those that work in the
frequency domain tend to have non-zero latency. by this i mean that
the synth will generate output for a given time period at the same
time as processing any input it receives (which may be nothing at all)
for the same time period.
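for illustration only (this is not any particular synth's code), a toy jack client that behaves that way: its process callback fills the output buffer for cycle N entirely within cycle N, so the client itself adds no delay. the "toysynth" name and the fixed 440 Hz tone are made up for the example.

/* sketch: a zero-latency "softsynth" -- output for each period is
 * generated inside the process callback for that same period, so
 * nothing is buffered across cycles and the client adds no latency. */
#include <math.h>
#include <unistd.h>
#include <jack/jack.h>

static jack_port_t *out_port;
static double sample_rate;
static double phase;

static int process(jack_nframes_t nframes, void *arg)
{
    jack_default_audio_sample_t *out = jack_port_get_buffer(out_port, nframes);
    (void) arg;

    for (jack_nframes_t i = 0; i < nframes; ++i) {
        out[i] = 0.2f * (jack_default_audio_sample_t) sin(phase);
        phase += 2.0 * M_PI * 440.0 / sample_rate;   /* fixed test tone */
    }
    return 0;
}

int main(void)
{
    jack_client_t *client = jack_client_open("toysynth", JackNullOption, NULL);
    if (client == NULL)
        return 1;

    sample_rate = jack_get_sample_rate(client);
    out_port = jack_port_register(client, "out", JACK_DEFAULT_AUDIO_TYPE,
                                  JackPortIsOutput, 0);
    jack_set_process_callback(client, process, NULL);
    jack_activate(client);

    for (;;)
        sleep(1);   /* run until killed */
}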
Yes, but this output has to pass through a buffer and that introduces
delay.
this is (mostly) accounted for by the latency values for the ports
the synth is connected to. for example, any alsa_pcm:playback port
will report a latency equal to the audio interface hardware buffer
size (which is a good approximation of the true value). systemic latency
beyond the audio interface hardware buffer (such as D/A converters)
is not (currently) accounted for. therefore, if you ask for the
latency on the output port of the softsynth, it will include this
value automatically (if it's connected).
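purely as a sketch (assuming the jack C API and the alsa driver's port naming), you can read that value back yourself with something like:

/* sketch: read the latency a physical playback port reports for
 * itself -- with the alsa driver this is (roughly) the hardware
 * buffer size. the port name is whatever your driver exports. */
#include <stdio.h>
#include <jack/jack.h>

void show_playback_latency(jack_client_t *client)
{
    jack_port_t *pb = jack_port_by_name(client, "alsa_pcm:playback_1");
    if (pb != NULL)
        printf("alsa_pcm:playback_1 reports %u frames of latency\n",
               (unsigned) jack_port_get_latency(pb));
}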
With filters in the frequency domain you mean stuff like Fourier
and wavelet analysis, where you need a certain window size to get
low frequencies. In these cases you would have two latencies which add
up: the filter/softsynth latency and the output latency.
see above. the client sets its own port
latency. jack_port_get_total_latency() will sum the latencies
between the port and the physical port.
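as a sketch of that (the window size and names are invented, and this uses the original jack latency calls): the frequency-domain client declares its own delay on its port, and anyone asking for the total latency through that port gets window plus whatever lies downstream (e.g. the hardware buffer).

/* sketch: an FFT-style client that needs a full analysis window
 * before it can emit output declares that delay on its own port;
 * jack_port_get_total_latency() then sums it with everything
 * between this port and the physical output. */
#include <jack/jack.h>

#define ANALYSIS_WINDOW 1024   /* hypothetical window size, in frames */

jack_nframes_t declare_and_sum(jack_client_t *client, jack_port_t *fft_out)
{
    jack_port_set_latency(fft_out, ANALYSIS_WINDOW);
    return jack_port_get_total_latency(client, fft_out);
}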
In the case that the softsynth is connected to a track
in ardour, there's also a possible delay introduced for this [softsynth
output]->[ardour track input] connection (i don't know the details)? I
suppose this is also compensated for when handling the output?

there are no delays. any data generated by the softsynth on cycle N is
processed by ardour on cycle N. the results are output on cycle N.
ardour does
exactly this. it delays all tracks by various amounts so
that they sync up with the most latent/delayed one. this includes
internal delays caused by plugins, and external delays due to signal
routing.
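the bookkeeping behind that is simple to sketch (this is the idea, not ardour's actual code):

/* sketch: given each track's total latency, delay every track by
 * the difference between the worst latency and its own so that all
 * tracks line up with the most latent one. */
#include <jack/jack.h>

void compute_compensation(const jack_nframes_t *latency,
                          jack_nframes_t *extra_delay, int ntracks)
{
    jack_nframes_t worst = 0;
    for (int i = 0; i < ntracks; ++i)
        if (latency[i] > worst)
            worst = latency[i];
    for (int i = 0; i < ntracks; ++i)
        extra_delay[i] = worst - latency[i];
}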
This is so cool :) I have to try that and find a midi sequencer that can
act as a jack transport slave.
we're still working on fixing the adjusted position of recorded
material to reflect the worst case output latency. right now, ardour
positions captured material to reflect when the pressure wave hit the
microphone (in the common case of recording acoustic material, and
ignoring A/D delays). it turns out that this is wrong a lot of the
time - when multitracking, the recorded material needs to be adjusted
to line up with what it was *played with*. it's quite complex to get
this right in every situation, but we're trying our best.
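to sketch the two choices (again, just the arithmetic, not ardour's code): "capture latency" is the path into the machine, "playback latency" the worst-case path out to the performer's ears.

/* sketch: where to place newly captured material on the timeline. */
#include <jack/jack.h>

/* position it where the pressure wave hit the microphone */
jack_nframes_t align_with_source(jack_nframes_t capture_start,
                                 jack_nframes_t capture_latency)
{
    return capture_start - capture_latency;
}

/* position it to line up with what the performer was playing along
 * with, i.e. also remove the worst-case output latency */
jack_nframes_t align_with_monitor(jack_nframes_t capture_start,
                                  jack_nframes_t capture_latency,
                                  jack_nframes_t worst_playback_latency)
{
    return capture_start - capture_latency - worst_playback_latency;
}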
--p