[linux-audio-dev] What parts of Linux audio simply suck ?

Juan Linietsky coding at reduz.com.ar
Mon Jun 20 14:57:24 UTC 2005


Paul Davis wrote:

>>-ALSA/JACK integration in timestamping is poor; synchronizing audio to
>>MIDI is a pain
>
>i suspect that you don't understand (or perhaps even know about) the
>DLL-driven jack_frame_time().
>
What does DLL-driven mean? I used jack_frame_time() in chionic. It works,
but it's not enough, and I don't mean that this is JACK's fault. My
experience was that I made a realtime, ultra-high-priority thread for
receiving ALSA events and timestamping them, yet under high CPU or disk
load (streaming sequenced tracks to disk), or other conditions (ksysguard
continuously monitoring the proc filesystem), the events that ALSA
delivered to me were jittered in a very noticeable way. I tried this on
several machines, with the same results on all of them. It works, just
not in all cases. This is why I have insisted so much on JACK MIDI lately.
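For readers wondering what a DLL is in this context: it's a delay-locked
loop, a cheap second-order filter that turns jittery raw timestamps into a
smoothed, steadily advancing clock. A minimal sketch in C (the coefficients
and bandwidth choice here are illustrative, not JACK's actual
implementation):

```c
/* Second-order delay-locked loop for smoothing jittery event timestamps.
 * Illustrative sketch only -- not JACK's internal implementation. */
typedef struct {
    double e2;     /* current period estimate */
    double t0, t1; /* smoothed time of the current and next event */
    double b, c;   /* loop filter coefficients */
} dll_t;

void dll_init(dll_t *d, double now, double period, double bandwidth)
{
    /* omega = 2*pi * bandwidth * period; lower bandwidth = more smoothing */
    double omega = 6.283185307179586 * bandwidth * period;
    d->b  = 1.4142135623730951 * omega; /* sqrt(2) * omega */
    d->c  = omega * omega;
    d->e2 = period;
    d->t0 = now;
    d->t1 = now + period;
}

/* Feed one raw (possibly jittery) timestamp; returns the smoothed time. */
double dll_update(dll_t *d, double raw)
{
    double e = raw - d->t1;     /* prediction error */
    d->t0 = d->t1;              /* the old prediction becomes "now" */
    d->t1 += d->b * e + d->e2;  /* correct the phase */
    d->e2 += d->c * e;          /* correct the period estimate */
    return d->t0;
}
```

Because the returned time is the loop's prediction rather than the raw
reading, a late or early arrival perturbs the clock only slightly instead
of showing up verbatim in the timestamp.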

>>-Jack lack of midi
>
>as noted, released and undergoing subtle revisions as it moves towards
>CVS.
>
Yes, I found out about that after posting; congrats on it! I look
forward to using it.

>>-Jack lack of OSC or any way to do parameter automation from the sequencer
>
>name one platform that allows this. just one.
>
Allow what? Controlling your softsynths/effects with automation? I can't
think of any platform that DOESN'T support this. The problem is that under
Linux most programmers write softsynths or GUI-equipped effect processors
as JACK or JACK/ALSA clients, thus making it impossible to automate
their parameters. JACK doesn't provide any means for this.

>>-It is Impossible to do any sort of offline render, or high quality 
>>render of a song (like, in 32/192khz) using JACK/Alsa
>
>i think you don't understand jack_freewheel().
>
No, I think you don't understand that ALSA is not a part of JACK
(I wrote JACK/ALSA above), so you can't jack_freewheel() with ALSA
MIDI events. However, since JACK will now have MIDI, I'm glad this is
not an issue anymore.

>>-Adding/Removing softsynths, linking connections, etc takes a while 
>>having to use qjackctl, etc
>
>tell me a system in which this not true. i use the patchbay in qjackctl;
>if you don't like qjackctl, i'm sorry and i am sure rui is as well.
>
I was referring more to having to run the applications yourself and
connect them afterwards. qjackctl helps, but I think it's still an
annoyance, not to mention the console-only applications, for which I
need to keep several consoles around (at least for me).

>>-Lack of send%.. I just cant have a jack client doing a very high 
>>quality reverb, only as wet processing and have clients send different 
>>amounts of the signal to it, thus saving CPU
>
>this is completely ridiculous. the client can attenuate on its inputs.
>where would you rather have these controls - distributed across N apps
>or on the control interface for just one?
>

No, no, just in the connection itself. Imagine the following case:
you have an app running a CPU-costly convolution reverb; it takes
one input and one output. Then you have two softsynths (or one with
individual ports) running two different instruments, a piano and a
violin. How can I send both with different amounts of reverb to the
convolution effect? I want the piano to have 75% reverb because, given
its characteristics, it will only increase the decay, but if I send
too much reverb to the violin it will get muddy. On hardware synths or
mixers you always have sends to achieve this easily. But you can't do
this in JACK, because I can't determine how much of the signal I'm
sending from one port to another.
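To be clear about what a "send" amounts to computationally: it's just a
per-connection gain applied before summing into the effect's input, which
a small intermediate mixer client could provide today. A sketch of the
summing loop itself (the function and names are mine, not a JACK API):

```c
/* Sum several source buffers into one destination, scaling each source
 * by its own send level first. Hypothetical helper, not a JACK API. */
void mix_with_sends(float *dest, const float **srcs,
                    const float *levels, int nsrcs, int nframes)
{
    for (int f = 0; f < nframes; f++) {
        float acc = 0.0f;
        for (int s = 0; s < nsrcs; s++)
            acc += levels[s] * srcs[s][f]; /* per-connection gain */
        dest[f] = acc;
    }
}
```

With a level of 0.75 on the piano port and, say, 0.25 on the violin port,
both instruments reach the same reverb input with different wet amounts,
and the convolution still runs only once.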

>>-Lack of tempo-map based transport, I cant adapt my midi-only sequencer 
>>, which works in bars,
>
>you can't do tempo-map based transport without sharing the tempo map.
>nobody has suggested a way to do this yet. please feel free.
>
I talked with Florian Schmidt on IRC about this some time ago. My
impression from what he said is that most JACK developers wanted to
keep tempo map sharing in a separate library, not inside JACK. I think
he suggested it should be integrated, and I'd love to see it integrated
too, as I believe that would ease its adoption. If you mean technically,
I don't really know, as I don't know the internals of JACK. I really
don't have the time to mess with that, much less to discuss with five
other developers about the most correct way to implement it. I am
responsible for my own software; I can't be responsible for everything
that doesn't work or doesn't support what I need.
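For what it's worth, the data a shared tempo map would carry is small: a
sorted list of tempo changes plus a conversion from musical time to
frames. A hypothetical sketch of that conversion in C (my own
illustration, not any proposed JACK interface):

```c
/* A tempo change taking effect at a given musical beat. */
typedef struct {
    double beat; /* position of the change, in beats from the start */
    double bpm;  /* tempo from this point on */
} tempo_change;

/* Convert a beat position to an audio frame by walking the tempo map.
 * 'map' must be sorted by beat, with map[0].beat == 0. */
double beat_to_frame(const tempo_change *map, int n,
                     double beat, double sample_rate)
{
    double frame = 0.0;
    for (int i = 0; i < n; i++) {
        /* this segment ends at the next change, or at our target beat */
        double end = (i + 1 < n && map[i + 1].beat < beat)
                         ? map[i + 1].beat : beat;
        if (end > map[i].beat)
            frame += (end - map[i].beat) * 60.0 / map[i].bpm * sample_rate;
        if (end == beat)
            break;
    }
    return frame;
}
```

A sequencer that works in bars would layer a time-signature list on top
of this, but the frame conversion above is the part that transport
clients would actually need to agree on.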

>>All this has simply led me to decide to not use such APIs anymore and 
>>integrate everything I do
>>in big, and monolithic apps, such as reason, cubase, etc and not care 
>>about the outside world anymore.
>>After all, it takes me less time to write the features I need for 
>>myself, and into my own than dealing with people's religious software 
>>views to get them integrated into other projects.
>
>well, thats sad. i wonder what audio API you'll decide to use ...
>
I'm still using JACK, except that all the softsynths/effects,
sequencing, and routing are done in-app, the monolithic way (same as
Windows/Mac apps). Please don't feel that I'm doing this because I'm
offended or unhappy with everything. My way of seeing this is much
simpler: I can't rely on the existing APIs and apps, and I feel it's
a lot more troublesome to deal with everything and everyone to get the
features I need implemented than to simply do it myself, in a
monolithic fashion.

Cheers!

Juan

More information about the Linux-audio-dev mailing list