Jack MIDI timestamping Was: Re: [linux-audio-dev] What parts of Linux audio simply suck ?

Stéphane Letz letz at grame.fr
Mon Jun 20 14:01:58 UTC 2005


On 20 June 2005, at 14:33, Benno Senoner wrote:

> Stéphane Letz wrote:
>
>
>>
>>
>> Good question....
>>
>> I would say that one of the main problems is that jack MIDI does
>> not have the concept of time-stamps in the "future". Jack MIDI
>> "events" are received in real-time (similar to what MidiShare does
>> with its receive callback concept) but also have to be delivered
>> in real-time.
>> In the absence of any scheduling mechanism, I don't think the
>> MidiShare API could be rebuilt on top of the underlying Jack MIDI
>> buffer transport system.
>>
>
> I have not looked at the jack MIDI API yet, but what do you mean by
> time-stamps in the future?
> MIDI events timestamped relative to the current audio fragment, or
> timestamps arbitrarily far in the future, e.g. "deliver this MIDI
> event 100000 samples from the current position"?
>
> I think it's probably enough that jack provides the former.
> E.g. my audio fragment is 128 frames:
> note on ch 1, note 60, velocity 100 at frame 20
> note on ch 1, note 64, velocity 80 at frame 100
>
> so the timestamp is always between 0 and 127 (the fragment size).
>
> AFAIK VST does it that way, and sequencers are perfectly fine with
> that scheme.
>
> I don't believe that future timestamping (with timestamp >
> fragment size) provides many advantages, since it leads to long
> queues and poor interactivity.
> E.g. if I want to modify the MIDI data in real time, the first
> approach works better, since the latency until the modifications to
> the MIDI data become audible is only 1-2 audio fragments.
>
> I think the argument that we need a kernel-based MIDI API that
> timestamps events in the future, because user-space processes
> cannot provide decent timing, is moot these days: Linux RT
> performance is excellent, and sound generation is often internal
> (e.g. virtual MIDI instruments/samplers). So having jack handle
> MIDI + audio together can provide better timing (sample accuracy)
> than the ALSA sequencer, and it simplifies the programming of
> virtual MIDI synths/samplers (no RT MIDI sensing thread, etc.).
> Look at VST: two simple callbacks, processEvents() and process(),
> are enough to implement sample-accurate virtual instruments.
>
> Thoughts?
>
> cheers,
> Benno
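For context, a minimal C sketch of the per-cycle scheme Benno describes
above: each event carries a frame offset between 0 and nframes-1 and is
handled sample-accurately inside the process callback. It uses the JACK
MIDI API from <jack/midiport.h> as it later shipped (the API was still
under discussion at the time of this thread); handle_note_on() and the
port variable are hypothetical placeholders.

/* Sketch: reading per-cycle, sample-accurate MIDI events in a JACK
 * process callback. handle_note_on() is a hypothetical voice-trigger
 * function; midi_in is assumed to be registered elsewhere. */
#include <stdint.h>
#include <jack/jack.h>
#include <jack/midiport.h>

extern jack_port_t *midi_in;                 /* registered elsewhere */
void handle_note_on(int channel, int note, int velocity,
                    jack_nframes_t frame_in_cycle);    /* hypothetical */

int process(jack_nframes_t nframes, void *arg)
{
    void *buf = jack_port_get_buffer(midi_in, nframes);
    uint32_t n = jack_midi_get_event_count(buf);

    for (uint32_t i = 0; i < n; ++i) {
        jack_midi_event_t ev;
        if (jack_midi_event_get(&ev, buf, i) != 0)
            continue;

        /* ev.time is a frame offset inside this cycle: 0..nframes-1.
         * With nframes == 128, Benno's two events would arrive with
         * ev.time == 20 and ev.time == 100. */
        if (ev.size == 3 && (ev.buffer[0] & 0xF0) == 0x90 && ev.buffer[2] > 0)
            handle_note_on((ev.buffer[0] & 0x0F) + 1,  /* channel 1..16 */
                           ev.buffer[1],               /* note          */
                           ev.buffer[2],               /* velocity      */
                           ev.time);
    }
    return 0;
}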


I was only answering the initial question of whether a MidiShare-like
API (where events can carry future timestamps, to be delivered to
clients at their due time by means of a scheduler) could be built on
top of the jack MIDI API as it is currently designed. And the answer
is: probably not.

Stephane
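Since jack MIDI itself has no scheduler, a scheduler of the kind
Stéphane mentions would have to live in the client (or a helper
library) on top of the per-cycle offsets. A rough C sketch of what
that layer might look like follows; this is not the MidiShare API,
only an illustration under assumed names (queue, queue_len, the port
and client variables), and a real implementation would also need a
lock-free queue to stay RT-safe.

/* Sketch: mapping absolute "future" frame timestamps onto per-cycle
 * frame offsets. Events queued with an absolute frame time are
 * flushed into each cycle at the matching offset; late events are
 * clamped to offset 0. */
#include <stddef.h>
#include <jack/jack.h>
#include <jack/midiport.h>

typedef struct {
    jack_nframes_t when;            /* absolute frame time */
    unsigned char  data[3];         /* raw MIDI bytes      */
} queued_event;

static queued_event queue[64];      /* filled elsewhere, sorted by 'when' */
static size_t       queue_len = 0;
static size_t       next_ev   = 0;  /* index of next undelivered event    */
extern jack_client_t *client;       /* opened elsewhere                   */
extern jack_port_t   *midi_out;     /* registered elsewhere               */

int process(jack_nframes_t nframes, void *arg)
{
    void *buf = jack_port_get_buffer(midi_out, nframes);
    jack_nframes_t start = jack_last_frame_time(client); /* cycle start */

    jack_midi_clear_buffer(buf);

    while (next_ev < queue_len && queue[next_ev].when < start + nframes) {
        jack_nframes_t off =
            queue[next_ev].when > start ? queue[next_ev].when - start : 0;
        jack_midi_event_write(buf, off, queue[next_ev].data, 3);
        ++next_ev;
    }
    return 0;
}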




