cal wrote:
On 05/10/10 18:51, Arnout Engelen wrote:
Latency? Or jitter?
Not sure - possibly the main reason for the post was to seek help in resolving
fallacies in my thinking. With a JACK period of 128 frames, a MIDI event
associated with frame 129 won't actually get applied to audio generation until
the generation of frames 257-384. On the other hand, an event against frame 128
will make it into the generation of frames 128-256.
You seem to be assuming that when you are generating the sound for a
period and find a new event, you have to apply the event's effect to the
entire period, i.e., that the event's note then starts at the start of
the period.
What you have to do is to remember the event's timestamp (measured in
frames, as above), and then apply the event at the position in a later
period that corresponds to the original time, i.e., at the same offset
the event had relative to the period in which it was received. From your
example above, the audio data for the event received for frame 128 should
start at the last frame of the following period, i.e., frame 256, while
the audio data for the event received for frame 129 starts at the first
frame of the next period, frame 257.
Jitter is defined as varying latency. You can remove it by taking the
worst-case latency, if one exists (one period in this case), and applying
it to all events.
Regards,
Clemens