Just wondering if I understand this correctly. I'm making a loop-based app
for step sequencing. When I previously did this in Csound, I clocked it off
a phasor, so the timing was sample accurate (though that brought its own
issues, to be sure). I'm wondering whether I should do the same thing in a
JACK app, or use the JACK transport clock, or some hybrid.
My question: am I correct in understanding that if I use the JACK transport
position to rewind in time, I'll get:
A) any other clients with running audio looping back too (may or may not be
desirable), and
B) jitter based on the amount of time left between when the loop should
end and the end of the frame buffer in which the loop length runs out?
Has anyone solved B? Could it be done by some complex tempo-cheating trick?
Does anyone have methods they've used for tight loop timing in a
JACK app?
Pointers at code appreciated of course. =)
thanks!
Iain
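[Editor's note: one common way to avoid the jitter in B is to split each process buffer at the exact frame where the loop wraps, instead of only wrapping at buffer boundaries. A minimal sketch of that idea in plain C, assuming a loop length and phase counted in frames; emit_events() is a hypothetical stand-in for the app's sequencer, not a JACK call:]

```c
/* Sample-accurate looping inside a JACK-style process callback.
 * Rather than rewinding at buffer boundaries (which jitters by up to
 * one period), split the buffer at the exact frame where the loop
 * wraps. emit_events() is a hypothetical stand-in for the sequencer. */
#include <stdio.h>
#include <stdint.h>

static uint64_t phase = 0;          /* current position within the loop */

/* Hypothetical event emitter: 'offset' is the frame offset inside this
 * buffer at which events for loop position 'pos' should be rendered. */
static void emit_events(uint64_t pos, uint32_t offset, uint32_t nframes)
{
    printf("render loop pos %llu at buffer offset %u for %u frames\n",
           (unsigned long long)pos, offset, nframes);
}

/* Called once per process cycle with the period size in frames. */
static void process(uint32_t nframes, uint64_t loop_len)
{
    uint32_t done = 0;
    while (done < nframes) {
        uint64_t left_in_loop = loop_len - phase;
        uint32_t chunk = nframes - done;
        if (left_in_loop < chunk)
            chunk = (uint32_t)left_in_loop;
        emit_events(phase, done, chunk);
        phase = (phase + chunk) % loop_len;  /* wrap exactly on time */
        done += chunk;
    }
}
```

Because the wrap happens mid-buffer at a known frame offset, the loop point never drifts relative to the buffer size.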
On , Iain Duncan <iainduncanlists(a)gmail.com> wrote:
> Thanks! Did you just write it?
Yup. As in literally just there. And I was reading your post in the new
RAUL thread as you were typing that :D
All the best, -Harry
On Tue, Nov 22, 2011 at 09:13:37PM +0100, Nick Copeland wrote:
> If you are using a toolkit that has a data flow of the following:
>
> pointer motion->graphical display->values->application->output
>
> Well, basically that is broken as you have a flow that is
>
> input->output->input->application->output
>
> invariably that is going to lead to issues. The tail (the toolkit) is wagging the dog
> (the application) as it imposes restrictions on the values the application is allowed
> to see.
>
> My opinion (ok, it is only opinion) is that the correct flow is
>
> input->application->output
Yes, I see your point, and it makes a lot of sense. So what would be
required is
* compute the new parameter value from
- a stored state in 'parameter space' rather than 'widget space'
- and pointer (mouse) gestures,
* update the widget according to that value.
This is more or less what I do in the rotary controls used
in e.g. zita-at1 and zita-rev1. It's possible because the
mouse movement and the visual representation of the value
(the angle of the line on the rotary knob) are not directly
related anyway.
But this is not how most (all) toolkits work.
You could probably use them in the way you suggest with some
extra effort. But in many cases (e.g. linear sliders) the
pointer and widget would have to remain in sync visually,
which then means that your resolution in parameter space
can't be better than the visual one. Unless you allow the
pointer to move faster than the visual object it is controlling
(which is what I do in the 2-D panner, but it's possible only
because the widget is so small).
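[Editor's note: the scheme described above can be sketched in a few lines of C: the canonical state lives in parameter space, pointer motion is applied as a relative gesture, and the widget's angle is derived from the value, never the reverse. All names here are illustrative, not from any real toolkit:]

```c
/* input->application->output flow for a rotary control: state is kept
 * in parameter units, mouse motion updates it as a relative gesture,
 * and the visual angle is computed from the value for redrawing. */
#include <math.h>

typedef struct {
    double value;       /* canonical state, in parameter units */
    double min, max;
    double gain;        /* parameter units per pixel of mouse motion */
} knob_t;

/* Apply a relative pointer gesture (dy pixels, up = positive). */
static void knob_drag(knob_t *k, double dy)
{
    k->value += dy * k->gain;           /* update in parameter space */
    if (k->value < k->min) k->value = k->min;
    if (k->value > k->max) k->value = k->max;
}

/* Derive the widget's visual angle from the value, never the reverse. */
static double knob_angle(const knob_t *k)
{
    double t = (k->value - k->min) / (k->max - k->min);
    return -135.0 + t * 270.0;          /* degrees, typical knob sweep */
}
```

Since the stored value is a double in parameter units, its resolution is independent of how many pixels the widget spans.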
Ciao,
--
FA
Vor uns liegt ein weites Tal, die Sonne scheint - ein Glitzerstrahl.
Hi,
I am trying to compile Aliki 0.1 on Ubuntu 10.10.
I installed the required libraries
libclthreads (>= 2.4.0) and libclxclient (>= 3.6.1)
and libclalsadrv (>= 2.0.0).
but when I try to compile I get this error:
In file included from aliki.cc:26:
styles.h:26: fatal error: clxclient.h: No such file or directory
compilation terminated.
Note: I found and fixed a problem installing libclxclient, but maybe I
broke something...
When I tried to install clxclient it showed this error:
In file included from xdisplay.cc:22:
clxclient.h:31: fatal error: X11/Xft/Xft.h: No such file or directory
compilation terminated.
I found a solution on a website; it was fixed with:
sudo apt-get install libxft-dev
thanks,
federico lopez
http://kinlan-presentations.appspot.com/bleeding/index.html#42
............
Don't we already have HTML5 <audio>?
Yes :)...but <audio> can only take us so far
Simple low-latency, glitch-free, audio playback and scheduling
Real-time processing and analysis
Low-level audio manipulation
Effects: spatial panning, low/high pass filters, convolution, gain, ...
...........
Judging by all the google-chrome symbols, this appears to be
google-chrome-specific. Is any of this FOSS and available in Chromium?
-- Niels
http://nielsmayer.com
Since everyone's tips here were so helpful in the ringbuffer
conversation, does anyone have pointers on where to start
understanding JACK transport and clocking, other than the transport client
example?
thanks!
Iain
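[Editor's note: beyond the API calls, the heart of transport clocking is the arithmetic mapping an absolute frame count to bar:beat:tick. A sketch of what a JACK timebase callback computes, using plain types so it can be read and tested without a running server; the fields mirror jack_position_t's BBT members, but nothing below is the actual JACK API:]

```c
/* Map an absolute frame position to bar:beat:tick, given sample rate,
 * tempo and meter -- the core of what a JACK timebase callback fills
 * into jack_position_t. Bar and beat are 1-based, as in JACK. */
#include <stdint.h>

typedef struct {
    int32_t bar, beat, tick;
} bbt_t;

static bbt_t frame_to_bbt(uint64_t frame, double sample_rate,
                          double bpm, int beats_per_bar,
                          int ticks_per_beat)
{
    double frames_per_beat = sample_rate * 60.0 / bpm;
    uint64_t abs_beat = (uint64_t)(frame / frames_per_beat);
    double   beat_frac = frame / frames_per_beat - (double)abs_beat;

    bbt_t p;
    p.bar  = (int32_t)(abs_beat / beats_per_bar) + 1;
    p.beat = (int32_t)(abs_beat % beats_per_bar) + 1;
    p.tick = (int32_t)(beat_frac * ticks_per_beat);
    return p;
}
```

For example, at 48 kHz and 120 BPM one beat is 24000 frames, so frame 120000 in 4/4 lands on bar 2, beat 2.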
Still curious about RAUL. As I have no immediate plans beyond learning how
to write a proper audio app, even if license restrictions prevent me from
using RAUL in a hypothetical commercial product years down the road, it may
well be worth using for my own personal needs in the meantime.
I would love to hear feedback on the technical merits of RAUL, minus the
license conversation, which now has its own thread. ;-)
Heck, I'd like to hear about any other libraries worth looking into too. I'm
leaning at the moment toward STK, embedded Csound, JACK, and Qt for the GUI.
The part I need library help with is likely synchronization and
interprocess/interthread communication. (i.e. do I use the JACK ringbuffer?
Do I look at Boost queue implementations? Does RAUL have a higher-level
convenience ring buffer?)
thanks!
iain
I found it on Dave's site, but other than that, couldn't find much
mention of it. Do many people use it? Would it be wise to dig into RAUL for
writing a real time jack app?
Dave, any comments on it?
http://drobilla.net/software/raul/
thanks
iain
Can anyone point me at what they consider the best thing to look at for an
introduction to communication between threads in a JACK app using the
ringbuffer?
I found some, but as the docs appear a bit scattered, I wondered if there
was a known best-first-reference type thing.
thanks
iain
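[Editor's note: the key idea behind jack_ringbuffer_t is a single-producer/single-consumer queue where each index is advanced by only one thread, so no locking is needed. A simplified sketch in plain C (power-of-two size, byte-wise copies, memory barriers omitted for brevity); this is not the libjack implementation -- see ringbuffer.h in the JACK headers for the real API:]

```c
/* Minimal single-producer/single-consumer ring buffer in the spirit of
 * jack_ringbuffer_t: a UI thread writes fixed-size messages, the JACK
 * process callback reads them without locking. write_ptr is touched
 * only by the producer, read_ptr only by the consumer. */
#include <string.h>
#include <stddef.h>

#define RB_SIZE 1024                    /* must be a power of two */

typedef struct {
    unsigned char buf[RB_SIZE];
    size_t write_ptr;                   /* advanced only by the producer */
    size_t read_ptr;                    /* advanced only by the consumer */
} rb_t;

static size_t rb_read_space(const rb_t *rb)
{
    return (rb->write_ptr - rb->read_ptr) & (RB_SIZE - 1);
}

static size_t rb_write(rb_t *rb, const void *src, size_t n)
{
    size_t avail = RB_SIZE - 1 - rb_read_space(rb);
    if (n > avail) return 0;            /* refuse partial messages */
    for (size_t i = 0; i < n; i++)
        rb->buf[(rb->write_ptr + i) & (RB_SIZE - 1)] =
            ((const unsigned char *)src)[i];
    rb->write_ptr = (rb->write_ptr + n) & (RB_SIZE - 1);
    return n;
}

static size_t rb_read(rb_t *rb, void *dst, size_t n)
{
    if (n > rb_read_space(rb)) return 0;
    for (size_t i = 0; i < n; i++)
        ((unsigned char *)dst)[i] =
            rb->buf[(rb->read_ptr + i) & (RB_SIZE - 1)];
    rb->read_ptr = (rb->read_ptr + n) & (RB_SIZE - 1);
    return n;
}
```

Writing and reading whole messages at once (all-or-nothing) keeps the process callback from ever seeing half a command.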
this is totally prealpha-OMG-IT'S-FULL-OF-FAIL state..
https://github.com/fps/jiss
requirements: SWIG, libjack-dev and liblua5.1-dev on ubuntu..
compile with make (if it fails, you're on your own; it's a simple
makefile, though). then run in the build dir:
lua wicked.lua
if you have jass running with a piano sample on MIDI channel 0, a bass
drum on channel 1 and a hihat on 2, you should get a rather weird
interpretation of "Stella by Starlight", a jazz standard..
something like this (some effects added with jack-rack):
http://shirkhan.dyndns.org/~tapas/stella.ogg
(wicked.lua code here, with some chords at the start omitted and some
remarks added in comments):
-- some stuff :D
require "jiss"
require "jissing"
-- create engine in stopped state
e = jiss.engine()
-- setup some state that the sequences later use
-- e:run can only be used when the engine is stopped..
-- as this is executed in non-RT context it's ok to
-- create some variables and tables here..
e:run([[
bar = 0;
min = 20;
max = 80;
stella = {
range(min, 80, min7b5(E(4))),
range(min, 80, min7b5(E(4))),
-- cut away quite a bit here (see wicked.lua in git clone) :D
range(min, 80, maj7s11(B(4)-1)),
range(min, 80, maj7s11(B(4)-1))
}
]])
-- this sequence can control the others since it's processed before
-- the others in the engine
-- events string is newline sensitive. in this case the events
-- on consecutive lines are spaced 1 second apart..
-- also: loop back to 0 at time t = 8 sec
tune = seq(e, "tune", loop_events(8, events_string(1, [[
drums1:relocate(0.0); drums1:start_(); notes:relocate(0.0); notes:start_()
drums1:stop_();
]])))
-- manually start this sequence and add to the engine
tune:start()
-- note that a copy is appended to the engine
e:append(tune)
-- a sequence that controls the global variable bar to advance
-- through the song
play(e, seq(e, "control", loop_events(1, events_string(1, [[
bar = bar + 1; bar = (bar % #stella);
]]))))
-- events at fixed times. loop at t = 0.75 sec
play(e, seq(e, "notes",
    loop_events(0.75, {
        { 0.125, [[ for i = 1,4 do note_on(0, 24 + stella[bar][math.random(#stella[bar])], 30 + math.random()*64) end ]] },
        { 0.5, [[ for i = 1,2 do note_on(0, 24 + stella[bar][math.random(#stella[bar])], 10 + math.random()*34) end ]] }
    })))
-- a drum pattern
drums = [[
note_on(1, 64, 127); note_on(2, 64, 127)
note_on(2, 64, 127)
note_on(2, 64, math.random(127))
note_on(2, 64, math.random(127))
note_on(2, 42, 110)
note_on(2, 64, 127)
note_on(2, 64, math.random(127))
note_on(1, 64, 127); note_on(2, 64, 127)
note_on(2, 64, math.random(127))
]]
play(e, seq(e, "drums1", loop_events(1, events_string(0.125/2, drums))))
-- connect all sequence outputs to jass:in
connect(e,"jass:in")
-- run the whole thing
e:start()
-- wait for the user to press enter
io.stdin:read'*l'
Have fun,
Flo