Hi.
I am currently trying to figure out how to restructure the code
of my little yatm tool so that it stays flexible and extensible in the future.
Basically, I'd like it to be able to become a JACK client at
some point. Since JACK uses a pull (callback) model, I have to
restructure my code substantially. I think what I need is the following:
Three threads, one for the UI, one for the MPEG/Vorbis/Speex decoder,
and one for audio output (to either libao or JACK).
I'll also need a buffer between the decoder and the audio thread, and some
way to signal the decoder thread that more audio data is needed.
Additionally, I'd like to have the ability to make the buffer
in between remember the last n seconds of audio data and make
it possible to jump back in time <=n seconds during playback.
I've looked at JACK's lock-free ringbuffer, but I am not
sure it is really what I need here. I am very new to
Linux audio programming, and especially to pthreads (I wrote
my first threaded program two days ago).
I'd be very happy if some of you experts could give me
a hand and point me in the right direction.
Maybe there is already code out there which does roughly this,
that I could look at?
Or maybe you can suggest which mechanisms of pthreads I should
be using for this kind of scenario.
--
CYa,
Mario | Debian Developer <URL:http://debian.org/>
| Get my public key via finger mlang(a)db.debian.org
| 1024D/7FC1A0854909BCCDBE6C102DDFFC022A6B113E44
Greetings:
The European mirror of the Linux soundapps site is back online with a
new address:
http://linuxsound.atnet.at
Please update your bookmarks.
This site is now in sync with the US and Japanese sites.
Best regards,
dp
Hi all,
A question regarding JACK use in a live setting. As far as I can tell,
soft mode only works with non-realtime JACK. What should I do if I
want realtime performance but a really forgiving server, so that dropouts
never cause the server to time out? It's more important to me to have
continuous audio than glitch-free audio (I never ever want everything
to shut down).
If the answer is ultimately to write better JACK clients, that's OK ;)
cheers,
dave
OK.
I have permission to get one of these programs moving, with some
constraints (appended). Which do you guys think would be most useful
(publishable :-) )?
1. The OpenGL spectrogram implemented in Steve Harris's meterbridge
2. A 31-band (or more?) graphic EQ
These would both be implemented into JACK
My advisor requested these specs:
1- This should run on any platform, including the PC platform.
<Yeah, right. Perhaps I'll just implement the code and wrap it with JACK
and the Windows API with preprocessor commands... Grrr...>
2- The interface would be built with a Tcl/Tk script which is compatible
with the SGI, Apple and PC platforms.
<I'm just not sure he knows what he's talking about... Anyway, I think he
just wants an interface that will work on Linux and Windows... again, see
#1>
3- The FFT algorithm should run in parallel and be optimized for at least
two platforms.
<I guess once we're splitting up the spectrum, this shouldn't be a
problem... BTW, the advisor teaches an MPI course. This idea probably came
from that... I'm not sure what he means by "two platforms.">
4- We can use a Wavelet Transform instead of the FFT and parallelize it.
<*shrug* I'm gonna hafta do some research. Wikipedia says it's like an FFT
running in O(N) instead of O(N log N):
http://en.wikipedia.org/wiki/Wavelet_transform
>
I can recommend against some of these. I just think he wants me to do some
extra work... Granted, without these recommendations, I'm not looking at
the suggested 500 hours of work...
This project is starting to look big and clunky:
1. Implementing for both Windows and Linux
2. Parallelizing the code? I don't mind writing multithreaded or MPI code...
it just looks like overkill...
3. Can JACK compile on IRIX? That's his OS of choice (being his only *nix
experience...) and, well, SGIs are fun! Perhaps I can convince him to put
Linux on his old O2.
Anyway, bottom line, it'll be fun, and I don't mind spending my time on
this, if ANY part of it will be of use....
...I'd just hate to kill Steve's code, though.
Anyway, suggestions?
Thanks y'all!
-Mike
Hey, I wouldn't mind working on the graphics, I just don't know where to start or
who else is working on it.
Jan
On Tue, 8 Jun 2004 09:18 , Steve Harris <S.W.Harris(a)ecs.soton.ac.uk> sent:
>On Tue, Jun 08, 2004 at 11:45:53 +0200, Marek Peteraj wrote:
>> VST plugins tend to be rather complex, offering tons of features and
>> eyecandish GUIs, while LADSPAs usually offer limited functionality, no
>> GUI at all(hosts usually provide simple ones to control the parameters).
>> But what's interesting is that each LADSPA plugin usually implements
>> exactly one type of DSP technique, for example, an oscillator, or a
>> delay. This basically leads to a situation where a certain DSP technique
>> is 'isolated' in a separate plugin.
>
>I think that's down to two factors (and it's not a good thing)
>
>1) LADSPA developers are few in number and short in time. The basics are a
> good place to start.
>
>2) The lack of a UI standard makes complex plugins a bit pointless.
>
>There are a few counter examples (e.g. my VyNil plugin wraps a lot of
>different bits), and in fact if you look in many LADSPA plugins you will
>see there's really more going on than there appears to be.
>
>[OT] - my canned plugin writing experience - all generalisations and IMHO
> of course
>
> Time breakdown: 10% writing code, 10% maths and optimising, 80% tweaking
> and tuning.
>
> Mapping the controls 1:1 with DSP parameters makes plugins crap - people
> say they want that if you ask them, but they don't mean it ;)
>
> Fewer controls is better.
>
> Affordance, appearance and usability have as much effect on the perceived
> sound quality as the DSP code (positively and negatively). Some of this can
> be achieved without a custom UI.
>
>You mentioned JAMin - true, it does use LADSPA plugins - but of the total
>amount of code the LADSPA plugins are a tiny fraction. I just reused them
>because I hate fixing bugs in two places :)
>
>[OOT] I used to think that a UI spec for LADSPA (to make it competitive
>with VST) was a technological problem. I now think it's a manpower issue
>(I think Paul Davis pointed this out a couple of years ago :). Games
>development has moved to the point where the graphics work is more
>expensive than the software development, and I bet it's not far off in
>plugin / eyecandy app development. We have no, or almost no, graphics
>people here.
>
>There are plenty of graphics people working on Free Software projects, but
>they all seem to be working on games projects. What a waste. I guess
>drawing goblins is more fun than sliders and LEDs. Who knew? ;)
>
>From: Alfons Adriaensen <fons.adriaensen(a)alcatel.be>
>
>On Fri, Jun 11, 2004 at 08:37:09AM -0400, Paul Davis wrote:
>
>> you cannot modify the graph in JACK while the graph is being used to
>> process audio. you do not know how long the graph modification will
>> take if you try to do it (for example) right after you're done with
>> one process cycle. the only sure way to do this is to use lock free
>> parallel programming techniques.
Would anyone please explain the details of these techniques?
Maybe it is something I already know, but it is not clear to me.
>The 'heavy state' is AFAICS not the real problem.
>Heavy things can be put in place before the actual graph reordering
>is done or removed after. This does not need to be done in the RT
>thread.
To speed up this approach one can also cache these items.
The engine's butler thread can initialize module structures, delay lines,
etc. in its spare time and place the initialized items in a cache. When
the engine needs an initialized item, it gets one instantly from
the cache. (No patent pending.)
Juhana
Does anybody have any opinion on which threading system is superior?
I've been using glib for a lot of things, but for whatever reason I'm
hesitant about using it for threading if the only benefit it will
provide is consistency (I'm guessing it's just a wrapper around pthreads
anyway).
[pb]
Greetings:
Can someone explain why TiMidity eventually hogs the CPU at 95% or
more after running for a while (like 12 hours or more)? I'm talking
about hogging the chip while TiMidity is idling, not playing. I'm using
it as a softsynth, it works well, but even in the latest version its CPU
usage just soars. Here's how I'm invoking 2.13.0 :
timidity -iA -B2,8 -c /home/dlphilp/timidity.cfg -A100 -Oj
-EFreverb=0 -EFchorus=0
Takashi: Obviously I spoke too soon in my earlier message. I just
looked at top again and saw that TiMidity was eating up 96% of the CPU. :(
Best,
dp
Hey,
I posted this a few weeks ago, here is the original thread:
http://eca.cx/lad/2004/06/0046.html
After doing some research it seems that the issue is actually with
JACK. The SBLive does not allow the hardware capture buffer to be set
to any values other than these:
static unsigned int capture_period_sizes[31] = {
    384,     448,     512,     640,
    384*2,   448*2,   512*2,   640*2,
    384*4,   448*4,   512*4,   640*4,
    384*8,   448*8,   512*8,   640*8,
    384*16,  448*16,  512*16,  640*16,
    384*32,  448*32,  512*32,  640*32,
    384*64,  448*64,  512*64,  640*64,
    384*128, 448*128, 512*128
};
This limits the period sizes you can set with
snd_pcm_hw_params_set_period_size(), which is what jackd uses to set the
period size to whatever is specified in the -p argument.
However, as far as I can tell, you should be able to use the ALSA
xfer_align parameter to align transfers to, say, 128 frames. Then you set
the playback period size to 128, and whenever you get an interrupt because
the playback buffer is running low, you write the next 128 frames to the
output device and read the next 128 from the capture buffer.
Since the differently sized hardware buffers for capture and playback are
a quirk of the SBLive, I think this would require hardware-specific code
for the EMU10K1 in JACK, the way there already is for the Hammerfall, HDSP,
and ICE1712.
Would someone with more low level knowledge of JACK and ALSA care to
comment?
Lee