Hi everybody,
Following up on the long discussion, I'm trying to fill in some of the
information you all seem to be missing.
There was a mention of this before, but not many of you paid any
attention:
here you can find an article about the current status of the protocol mess:
http://prosoundnewseurope.com/pdf/PSNLive/PSNLive_2009.pdf (page 28)
Obviously AES50 is good to go, but Ethernet AVB is the right thing, really.
The only catch is that it's still a work in progress, but many of the
proprietary vendors which already have their own networking solutions
(like Harman with HiQnet, the one I can name off the top of my head)
are involved in the AVB standard's development.
The idea of AVB is to bypass the IP layer, which is the right approach:
you don't need to assign IPs to your audio nodes, really!
In AVB you just select the channels that each node wants to listen to.
There is a fair bit of documentation on the IEEE 802.1 AVB task group's pages,
but XMOS looks like the best point of reference:
http://www.xmos.com/news/15-jun-2009/xmos-simplifies-ethernet-avb-implement…
I think we should forget everything else and crack on with the XS1 AVB
implementation!
Their XS1 chips seem really great, and the company is genuinely
innovative and open-source minded.
The official toolchain is LLVM-GCC based; you can use C, C++ or their own XC.
XC is basically C with some things omitted (like goto and floating point)
and XMOS I/O features added. Don't just say WTF, look at it first!
You should also watch the videos here:
http://www.xmoslinkers.org/conference-online-wf
especially the two about the "XMOS Architecture" and the AVB
presentation.
Some dev kits are quite expensive, but that's really due to low volume
;)
There is also a nice USB audio kit!
Plus there is a little board that is cheap and already has two RJ45s on
it :)
I'm studying the XC book myself at the moment and getting familiar with
the toolset :)
It looks very exciting, because these are innovative chips!
OK, maybe an FPU really is missing on the XCore, but how many DSPs have
one anyway? Well, quite a few, but there was no FPU on DSPs for ages! :))
Also, XC or C/C++ is so much more obvious than the bloody "mental American military engineers' nonsense" called HDL-whatever!
Cheers everyone,
hope you will appreciate my excitement :) (lol)
--
ilya .d
Hiho,
I managed to get SuperCollider and JACK running on my IGEP [1], on the
pre-configured Ubuntu on the SD card, but of course the audio is still bumpy.
So... I'm looking for an RT kernel for this little machine... Anyone have any
pointers?
I did find this one: http://beagleboard.org/project/omap-rt-patch/
but the project doesn't show much recent activity.
I also noticed that JACK didn't want to run with ALSA as the backend; OSS
as the backend works, though (but gives the bumpy sound).
Is anyone willing to share their experiences doing Linux audio on ARM processors?
sincerely,
Marije
[1] http://www.igep-platform.com/
Hello guys!
I would like to ask any of the developers who might be interested to help a
musician out!
The original Kluppe developer seems very busy these days, and his program
lacks two things I really need for my musical work.
I am ready to pay for the work. All the details below.
*1. Feature 1 - random play.*
Back on Windows, I wrote a simple program for myself called Tape Loops. It
could play sounds either once or in a loop. The special function I
added to it was "random play". It allowed you to define a period
of time in seconds, and the program would then trigger the sound randomly
somewhere within this period. So, if the period is 20 seconds, the program
would play the sound after 5 seconds, or 10, or 7, or 20. When
the sound stops playing, the timer is reset and the program once more
chooses a random moment within the given period to trigger the sound. By
changing the period the musician can make the sound get triggered more often
or less often.
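To make the behaviour concrete, here is a minimal sketch of the idea in plain
C, entirely independent of Kluppe's internals; play_loop_once() is a
hypothetical stand-in for whatever actually plays the sound:

#include <stdlib.h>
#include <time.h>

extern void play_loop_once(void);  /* hypothetical: plays the sound, returns when it stops */

static void random_play(double period_seconds)
{
    srand((unsigned)time(NULL));
    for (;;) {
        /* pick a random trigger point somewhere within the period */
        double wait = ((double)rand() / RAND_MAX) * period_seconds;
        struct timespec ts;
        ts.tv_sec  = (time_t)wait;
        ts.tv_nsec = (long)((wait - (double)ts.tv_sec) * 1e9);
        nanosleep(&ts, NULL);
        play_loop_once();
        /* the sound has stopped: loop around, which resets the timer
           and picks a new random moment within the period */
    }
}

In Kluppe itself the same logic would presumably hang off a timer attached to
a looper rather than a blocking loop, but the period/reset behaviour above is
the whole feature.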
A demo of how this program works can be viewed here:
http://www.louigiverona.ru/?page=projects&s=software&t=tapeloops&a=tapeloop…
It is a great feature - it helps an ambient composer like myself a lot and
is very useful for installations.
On Linux I have tried some programming, but even setting up JACK is very
difficult for me; I am absolutely not a strong desktop programmer, so
writing something like that from scratch on Linux is not realistic for me. I
tried Kluppe, which is the closest thing. It is a great piece of software,
and I studied the code for several days and tried some things, but apart from
changing the colors I do not seem to be able to do anything meaningful.
So I would like to ask someone to do this job for me: add a timer to a
Kluppe looper and implement this "random play" mode, where the musician can
put a looper into random play mode and define the period.
*2. Feature 2 - basic MIDI control.*
Looking through the Kluppe code, I saw that a lot of MIDI support is already
there, but it is not "attached" to the controls. I might be wrong and there
may be more work than it seems, but anyway: I want to be able to assign MIDI
control to triggering loops, volume and panning - at least that. Otherwise,
Kluppe is very difficult to use in a live performance.
However, instead of proposing separate controls for each
looper like SooperLooper has, I would advise (and actually ask for
this feature to be implemented in such a manner) going for a
"Selected looper" scheme instead, so that one would not need a dozen knobs to
control things. There should be one "Selected" looper,
similar to what Traktor DJ Studio has. You bind MIDI not to a
specific looper but to the Selected looper, and thus you would need only
two knobs (vol, pan) and three buttons (play/stop, Prev looper, Next
looper).
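A rough sketch of what that dispatch could look like, with made-up CC and note
numbers and a hypothetical Looper struct - not Kluppe's actual code, just the
shape of the "Selected looper" idea:

/* All controls act on loopers[selected]; two extra buttons move the selection. */
typedef struct { float volume; float pan; int playing; } Looper;   /* hypothetical */

#define N_LOOPERS 8
static Looper loopers[N_LOOPERS];
static int selected = 0;

enum { CC_VOLUME = 7, CC_PAN = 10,                            /* example CC numbers  */
       NOTE_PLAY_STOP = 60, NOTE_PREV = 61, NOTE_NEXT = 62 }; /* example note numbers */

static void handle_midi(unsigned char status, unsigned char data1, unsigned char data2)
{
    Looper *l = &loopers[selected];

    if ((status & 0xF0) == 0xB0) {                     /* control change */
        if (data1 == CC_VOLUME)      l->volume = data2 / 127.0f;
        else if (data1 == CC_PAN)    l->pan = (data2 / 63.5f) - 1.0f;
    } else if ((status & 0xF0) == 0x90 && data2 > 0) { /* note on */
        if (data1 == NOTE_PLAY_STOP) l->playing = !l->playing;
        else if (data1 == NOTE_PREV) selected = (selected + N_LOOPERS - 1) % N_LOOPERS;
        else if (data1 == NOTE_NEXT) selected = (selected + 1) % N_LOOPERS;
    }
}

However the MIDI events arrive (ALSA seq or JACK MIDI), everything funnels
through one small handler like this, which is why only five controls are
needed no matter how many loopers there are.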
*3. Payment.*
I understand that all of the above might not be as simple as it seems to me
now. I would be willing to pay as much as I can, and I am able to pay through
PayPal. I do not know what a normal rate for such work is, but I
think something can be arranged. If I am not able to pay it all at once, I am
willing to pay over several months in a row to cover the necessary expenses.
I will also ask around on forums whether someone will join me and also donate
some money - while my random play function is probably too specific and
only something for me, MIDI control in Kluppe is something I believe many
people would want.
Thank you for your attention and I hope someone gets interested in the
request!
Louigi Verona.
http://www.louigiverona.ru/
Hello everybody!
When using an LV2 synth within a sequencer, I was told that the host has no
way to access the audio which is produced by the plugin, so such a basic thing
as rendering your whole project to WAV appears to be impossible.
I want to ask whether this is true and whether it can somehow be changed. In my
opinion, if the above is true, it is a very serious limitation of LV2.
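For reference, this is roughly how a host drives an LV2 plugin through the
core API for each audio block (the output port indices here are hypothetical);
whether a particular sequencer then exposes those buffers for offline
rendering is exactly the question above:

#include <lv2/lv2plug.in/ns/lv2core/lv2.h>
#include <stdint.h>

/* Host-side per-block processing sketch: the host allocates the audio
   buffers and connects them to the plugin's ports before calling run(). */
static void render_block(const LV2_Descriptor *desc, LV2_Handle instance,
                         float *out_left, float *out_right, uint32_t n_samples)
{
    desc->connect_port(instance, 0, out_left);    /* hypothetical output port indices */
    desc->connect_port(instance, 1, out_right);
    desc->run(instance, n_samples);
    /* out_left / out_right now hold the block the plugin just produced */
}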
--
Louigi Verona
http://www.louigiverona.ru/
Greetings,
Martin Eastwood has posted the code for his MVerb:
http://martineastwood.com/
Open-source, GPLv3 free software.
Maybe someone could whip up a plugin or standalone app from this code?
PS: If you download the zip file, note that it does not include a
top-level directory, i.e. it will dump its contents into the current
directory.
Best,
dp
Hi,
this is all about making Linux Audio more useful.
The idea came about because on the one hand there are parts of Linux
audio that really need some coders' attention, and on the other hand there
are coders who don't know where to start. I realize that there are never
more than enough coders, so this is mainly about bringing attention to
the parts that need it the most.
To a degree that's what bug/feature trackers are there for, but those are
usually per application, and while there are category and priority
systems in place, they are rarely used.
So what this is also about is bridging the gap between users and developers,
and between applications.
It would be quite simple, really:
an easy-to-find, central place, possibly a wiki or a tracker.
Anyone, most likely a user, describes his workflow and what the
showstopper is. This could be applications not syncing properly, or an
essential but missing feature. The idea is to tackle mainly
infrastructure and cross-application problems, with the goal of making a
workflow actually work.
The user should specify all relevant information available, such
as version information, links, probably some kind of priority or urgency
indication, and how hard he believes it would be to fix.
He could also put up a reward of sorts, not necessarily monetary.
Any developer could pick up the task and work on it, possibly leaving a
notice.
The possible benefits I see are:
a) A kind of overview of what's needed the most, one place where you can
see what's actually important to users.
b) A way to identify and fix problems between applications - something I
believe is very important for a system that encourages the use of
multiple applications at once. I believe there are numerous
synchronisation/transport issues for example which are never really tackled,
despite this being a very important part of the infrastructure.
c) Emphasis on actual workflow and usability.
d) It would work for any program, even those without a tracker and those
that aren't high-profile and aren't usually at the center of attention.
Could this work? What do you think?
--
Regards,
Philipp
--
"Wir stehen selbst enttäuscht und sehn betroffen / Den Vorhang zu und alle Fragen offen." Bertolt Brecht, Der gute Mensch von Sezuan
Ichthyostega wrote:
> Ralf Mardorf wrote:
>> Another stupid question, induced by an argument about MIDI jitter made by
>> Daniel James.
>>
>>> [snip] I'm sceptical that the realtime kernel is the cause of your MIDI
>>> problems. If they got this right in the 80's, on computers which could not
>>> do anything near realtime audio processing, then I think it's more likely
>>> to be a question of MIDI application design.
>
> At that point we should recall how that whole story with "realtime"
> started. At the beginning there was a design mismatch. Many things related to
> the Linux kernel started out with a kind of "I feel fine" pragmatism.
> Which, btw, isn't a criticism as such, because this also accounts
> for the freshness and the sometimes unconventional new approach to some
> problems. But with regard to timing, for all of the first decade
> of Linux development there seemed to be a completely different
> mental model, which we could summarise as: performance == throughput,
> and timing is only relevant when you get a network timeout or
> a sluggish response in your application's GUI.
>
> Thus, if we now consider using a Linux kernel for making music, we must
> recognise that the whole design assumed, for isochronous work, about 1000
> times more headroom than there really is.
>
> Thus, as writing a new kernel doesn't seem to be an option, this whole
> tedious undertaking of the "realtime patches" can be described as an
> attempt to fix this "problem" (which was never assumed to be a problem
> in the initial design) by hunting down, one by one, each individual instance
> where the existing kernel could possibly react too slowly.
>
> Thus, we should rather be surprised at how well these realtime kernels work.
> OTOH, it isn't a surprise that the machines from the 80s meet these criteria;
> their OS software was written with an awareness of a much more limited
> processing capability right from the start.
>
>
>> Why do people (not only me) report jitter for external MIDI equipment, while I
>> couldn't find any report of real-time audio jitter? And what about asyncs
>> and clients being disconnected by JACK?
>
> Audio and MIDI are two quite different beasts.
> Sound is processed in blocks, where the individual unit (1 sample) is
> much more fine-grained and way below anything that can be discerned by
> the human ear. Moreover, the sound as such already exists and 'just' has to
> be piped through. By contrast, MIDI consists of events which
> immediately trigger a reaction - which could be that a piece of software
> and, at the same time, a piece of external hardware starts a processing
> cycle. You see, that's a completely different situation, and thus it's
> obvious why the same problem causes such different
> symptoms for these two media.
>
>> OTOH, on Windows audio clients don't get disconnected,
>> yet MIDI jitter is an issue there too.
>
> IIRC, this was a design decision for JACK. It never tries to conceal
> any timeout problem; rather it requires its clients to keep up with
> a very tight schedule and comply with very strict rules.
>
> I don't know the MIDI part of JACK well enough to judge whether it was
> designed with the same "you're required to comply" policy. And besides,
> when the MIDI interface is hooked up via USB, we again face a completely
> different situation. USB is a complicated protocol, with multiple
> versions and levels, and it is certainly not designed to get an individual
> event transferred reliably with less than 2 ms of jitter.
> There is even the possibility that the USB peers transparently negotiate a
> lower transfer rate or protocol version when they
> determine the connection can't keep up with the higher speed.
>
> Cheers
> Hermann V.
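(For reference, this is how a JACK MIDI client sees its events: a minimal
sketch of a process callback using the standard JACK MIDI API, with a
hypothetical client/port name. Each event carries a frame offset within the
current period, so JACK MIDI timestamps are in principle sample-accurate;
the question is what happens before and after JACK.)

#include <jack/jack.h>
#include <jack/midiport.h>
#include <stdio.h>
#include <unistd.h>

static jack_port_t *midi_in;

static int process(jack_nframes_t nframes, void *arg)
{
    void *buf = jack_port_get_buffer(midi_in, nframes);
    jack_nframes_t n = jack_midi_get_event_count(buf);

    for (jack_nframes_t i = 0; i < n; i++) {
        jack_midi_event_t ev;
        jack_midi_event_get(&ev, buf, i);
        /* ev.time is the frame offset inside this period, i.e. the event
           is timestamped with sample accuracy (printf is not RT-safe,
           it is here for illustration only) */
        printf("event at frame %u, %zu bytes\n", ev.time, ev.size);
    }
    return 0;
}

int main(void)
{
    jack_client_t *client = jack_client_open("midi-timing-sketch", JackNullOption, NULL);
    if (!client) return 1;
    midi_in = jack_port_register(client, "in", JACK_DEFAULT_MIDI_TYPE, JackPortIsInput, 0);
    jack_set_process_callback(client, process, NULL);
    jack_activate(client);
    sleep(60);               /* observe events for a minute, then quit */
    jack_client_close(client);
    return 0;
}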
It's said that USB MIDI interfaces should be the better choice, but Hermann's
explanation above clarifies a lot. I'm not sure how to read Fons' JACK MIDI
jitter test, but ...
Subject: Re: [LAD] Again MIDI jitter - tested with Fons test applications
Date: Sat, 27 Mar 2010 18:26:33 +0100
From: Ralf Mardorf <ralf.mardorf(a)alice-dsl.net>
To: fons(a)kokkinizita.net
CC: linux-audio-dev(a)lists.linuxaudio.org
References: <4BADBD42.4030505(a)alice-dsl.net> <20100327164326.GD1545@zita2>
> Hi Fons :)
>
> fons(a)kokkinizita.net wrote:
> > On Sat, Mar 27, 2010 at 09:09:38AM +0100, Ralf Mardorf wrote:
> >
> >
> >> Regularly it shifted between 2395 and 2404, but with a few exceptions:
> >> once 2302, three times 2304, twice 2305 and twice 2494.
> >> See attachment.
> >> What might cause these exceptions? Could it be RAM access by
> >> the graphics? Is there something bad going on with the IRQs?
> >>
> >> Regular shift 2404 - 2395 = 9 frames of jitter, exceptional maximal
> >> shift 2494 - 2302 = 192 frames of jitter.
> >>
> >> I guess this does mean ...
> >> 5.3 ms / 512 frames = 0.010351562 ms/frame
> >> Maximal difference for regular jitter 0.093164062 ms.
> >> Maximal difference for exceptional jitter 1.9875 ms.
> >> ... am I wrong?
> >>
> >
> > Wrong once or twice, if twice in such a way that the two
> > errors cancel out.
> >
> > First note that the test prints the difference between
> > events. That means that e.g. if *one* note is 100 samples
> > late you could see 2400 2500 2300 2400.
> >
> > The '2300' is just because the previous one was late,
> > not because this one arrives too early. So you should
> > divide the jitter as you measure it by two.
> >
>
> Aha, okay this is plausible.
>
> > Second, 5.33 ms = 256 frames at 48 kHz. But maybe you
> > are using 96 kHz ??
> >
>
> So you didn't read the attachment ;) - yes, I did use 96 kHz.
> [snip]
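For reference, converting those figures into milliseconds (at 96 kHz one frame
is 1/96 ms) and applying Fons' divide-by-two correction gives roughly:

1 frame at 96 kHz = 1/96 ms ~ 0.0104 ms
regular:     (2404 - 2395) / 2 = 4.5 frames ~ 0.047 ms of jitter
exceptional: (2494 - 2302) / 2 =  96 frames ~ 1.0   ms of jitter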
Subject: Again MIDI jitter - tested with Fons test applications
Date: Sat, 27 Mar 2010 09:09:38 +0100
From: Ralf Mardorf <ralf.mardorf(a)alice-dsl.net>
To: Linux Audio Developers <linux-audio-dev(a)lists.linuxaudio.org>
> When I once tested it by recording I got this result for ALSA MIDI on
> Linux, Cubase runs on Windows on the same machine:
>
> target||Cubase|HR tmr|System|PCM pl|PCM ca
> ------++------+------+------+------+------
> 500.0 || 493.0| 504.9| 505.6| 503.4| 503.2
> 1000.0|| 993.4|1005.4|1005.8|1005.3|1006.4
> 1500.0||1494.5|1503.6|1506.4|1507.4|1507.3
> 2000.0||1994.8|2003.8|2007.2|2007.9|2009.5
> 2500.0||2492.4|2504.1|2504.3|2503.6|2503.2
> 3000.0||2992.9|3006.0|3006.2|3005.9|3007.6
> 3500.0||3493.7|3502.7|3505.4|3506.5|3509.5
> 4000.0||3994.6|4003.1|4003.2|4008.8|4009.9
> msec +/- 0.1 msec
> maxDif|| 4.8| 6.0| 7.2| 8.8| 9.9
> minDif|| -2.4| -2.7| -3.2| -3.4| -3.2
> --------------+------+------+------+------
> Jitter|| 2.4| 3.3| 4.0| 5.4| 6.7
> msec +/- 0.2 msec
... as you can see, for Cubase I got this ~2 ms of jitter. So, regarding
your explanation, Hermann: Windows + ASIO + Cubase does a good job and only
the USB interface limits it, while on Linux there seems to be
another issue besides the USB interface. Btw, the Linux HR tmr clock is a PITA;
only System, PCM pl and PCM ca are usable without issues for all Linux apps.
What could be the reason that Windows is limited only by the USB
interface, at 2.4 ms, while Linux comes in at 4.0 ms on my machine?
Joshua Boyd on LAD wrote:
> On Thu, Jun 17, 2010 at 10:37:25AM -0400, Gene Heskett wrote:
>
>>> At my school we transferred the CAD files by floppy to a DOS box that
>>> controlled the CNC machine; I guess that's for the same reason: bad RT
>>> capabilities of newer OSes and machines.
>> RTAI works pretty well: I can start a job, switch away from that window,
>> and talk to the guys on IRC or browse the web without hurting the job.
>> That, to me, is true multitasking.
>
> So, that leaves me wondering why no one seems to be trying RTAI for
> audio work? Or is someone doing that and I'm just not aware?
Today I tried to do just that.
I tried to run JACK2 with the -R switch, both as a regular user and with sudo;
the result was the same as below, where I launched JACK2 without the -R switch
on 64 Studio 3.0 beta (based on Ubuntu Hardy):
$ uname -r
2.6.24-16-rtai
$ jackd -dalsa -dhw:0 -r96000 -p512 -n2
jackdmp 1.9.3
Copyright 2001-2005 Paul Davis and others.
Copyright 2004-2009 Grame.
jackdmp comes with ABSOLUTELY NO WARRANTY
This is free software, and you are welcome to redistribute it
under certain conditions; see the file COPYING for details
JACK server starting in non-realtime mode
creating alsa driver ... hw:0|hw:0|512|2|96000|0|0|nomon|swmeter|-|32bit
control open "hw:0" (No such file or directory)
Cannot initialize driver
no message buffer overruns
JackServer::Open() failed with -1
Failed to start server
The ALSA sequencer couldn't start either.
I ran the EMC2 / HAL latency-test:
Servo thread (1.0 ms): max interval 999180 ns, max jitter 10949 ns, last
interval 992259 ns
Base thread (25.0 us): max interval 34551 ns, max jitter 9640 ns, last
interval 24887 ns
The same test couldn't be used for my kernel-rt:
$ uname -r
2.6.31.12-rt20
$ latency-test
insmod: can't read '/usr/realtime-2.6.31.12-rt20/modules/rtai_hal.ko':
No such file or directory
RTAPI: ERROR: could not open shared memory (errno=2)
HAL: ERROR: rtapi init failed
halcmd: hal_init() failed: -9
NOTE: 'rtapi' kernel module must be loaded
RTAPI: ERROR: could not open shared memory (errno=2)
HAL: ERROR: rtapi init failed
halcmd: hal_init() failed: -9
NOTE: 'rtapi' kernel module must be loaded
RTAPI: ERROR: could not open shared memory (errno=2)
HAL: ERROR: rtapi init failed
halcmd: hal_init() failed: -9
NOTE: 'rtapi' kernel module must be loaded
ERROR: Module hal_lib does not exist in /proc/modules
ERROR: Module rtapi does not exist in /proc/modules
ERROR: Module rtai_math does not exist in /proc/modules
ERROR: Module rtai_sem does not exist in /proc/modules
ERROR: Module rtai_fifos does not exist in /proc/modules
/usr/bin/emc_module_helper: Invalid usage with args: remove rtai_ksched
[snip]
ERROR: Module rtai_hal does not exist in /proc/modules
Btw, should I comment out the EMC2 memlock line when doing audio work again,
or doesn't it have an impact?
$ cat /etc/security/limits.conf | grep memlock
# - memlock - max locked-in-memory address space (KB)
@audio - memlock unlimited
# @audio - memlock 2000000
* hard memlock 20480 #EMC2
Cheers!
Ralf
PS: What now? This is my second hardware setup. I bought a new
computer a while ago because the old computer wasn't OK for Linux
either, but I don't have the money to buy one machine after another
until I finally get lucky. Both machines are 100% stable under Windows;
under Linux I also get asyncs + distortion when using JACK2. I didn't test
whether current JACK1 is OK, or whether it would disconnect clients. Don't get
me wrong, I never was a private Windows user; those were just installs for
testing. I'm using only Linux at home.
Hello all,
A few days ago someone posted a pointer to some
python/numpy audio-related code. I wanted to keep
that message and have a look at it later, but my
fingers were too quick... And I can't find the
message in the archives (it could have been on LAU
as well).
If anyone still has that message, please send me
a copy !
TIA,
--
FA
O tu, che porte, correndo si ?
E guerra e morte !
Cheers for the reply & the commit!
It's a help anyway: I was at 12 errors, and now I'm down to 6 errors (+ 5 warnings).
All the errors seem to be of the same kind (something to do with the
array[2] declarations?).
I've never written a .vapi and don't really understand all the fancy tricks
you do...
So the best thing is probably to post the error output after running the command:
command: valac --pkg jack main.vala // my main.vala is just a single
printf statement, no JACK code
jack.vapi:48.36-48.44: error: syntax error, no expression allowed between
array brackets
public int get_aliases(ref string[2] aliases);
^^^^^^^^^
jack.vapi:48.36-48.44: error: syntax error, no expression allowed between
array brackets
public int get_aliases(ref string[2] aliases);
^^^^^^^^^
jack.vapi:270.41-270.47: error: syntax error, no expression allowed between
array brackets
public void get_read_vector(ref Data[2] vec);
^^^^^^^
jack.vapi:270.41-270.47: error: syntax error, no expression allowed between
array brackets
public void get_read_vector(ref Data[2] vec);
^^^^^^^
jack.vapi:271.42-271.48: error: syntax error, no expression allowed between
array brackets
public void get_write_vector(ref Data[2] vec);
^^^^^^^
jack.vapi:271.42-271.48: error: syntax error, no expression allowed between
array brackets
public void get_write_vector(ref Data[2] vec);
^^^^^^^
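For context, these are roughly the underlying JACK C calls those .vapi lines
bind (declared in <jack/jack.h> and <jack/ringbuffer.h>); in C the caller
passes a two-element array in both cases, which appears to be what the
string[2] / Data[2] in the vapi is trying to express:

#include <stdlib.h>
#include <jack/jack.h>
#include <jack/ringbuffer.h>

static void show_aliases(jack_port_t *port)
{
    /* jack_port_get_aliases() expects a 2-element array of caller-allocated strings */
    char *aliases[2];
    aliases[0] = malloc(jack_port_name_size());
    aliases[1] = malloc(jack_port_name_size());
    int n = jack_port_get_aliases(port, aliases);   /* fills in up to 2 aliases */
    /* ... use aliases[0 .. n-1] ... */
    free(aliases[0]);
    free(aliases[1]);
}

static void peek_ringbuffer(jack_ringbuffer_t *rb)
{
    /* the read/write vector calls expect a 2-element array of data descriptors */
    jack_ringbuffer_data_t vec[2];
    jack_ringbuffer_get_read_vector(rb, vec);
    /* vec[0].buf / vec[0].len and vec[1].buf / vec[1].len describe the readable
       data, possibly split into two segments where it wraps around */
}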
On Sat, Jun 26, 2010 at 10:25 PM, alberto colombo <vala(a)albx79.it> wrote:
> Hello,
>
> I created a bug to track progress on Jack bindings:
> https://bugzilla.gnome.org/show_bug.cgi?id=576777. I have just committed
> the latest version I had on my HD (dated November 2009), which should
> fix quite a few bugs compared to the version in that email.
>
> Please note that I didn't have the means or the time to test all the
> bindings, so expect some memory errors in functions I
> haven't used. It gives plenty of warnings now (it should be updated for
> Vala 0.9), but it should compile.
>
> Also, the bindings are for version 0.116. If you need features from
> 0.118, they'll have to be added.
>
> Regards
> Alberto
>
> On Sat, 2010-06-26 at 16:04 +0100, Harry Van Haaren wrote:
> > Hey all,
> >
> > I'm learning Vala at the moment, primarily with the intention of developing
> > apps which support JACK audio output. I'm going to work towards
> > GStreamer-based audio routing, and JACK Transport to start/stop my app.
> >
> > For now, I hope to play around with JACK using Vala. So I went
> > looking for bindings and found the following:
> > http://www.mail-archive.com/vala-list@gnome.org/msg02266.html
> >
> > I've read the replies and installed Jack.vapi into /usr/share/vala/vapi,
> > but I can't seem to get rid of certain errors whenever I compile.
> >
> > Are others using this binding successfully?
> > Any tips / info I should know about when using Jack & Vala?
> > Cheers, -Harry