I wrote:
> Subject: Re: [LAD] help on creating MIDI from linux input events
> Date: Sun, 05 Jul 2009 13:11:35 +0200
> From: Ralf Mardorf
> To: Renato Budinich
> CC: linux-audio-dev(a)lists.linuxaudio.org
>
>
>
> Renato Budinich wrote:
> > Thank you very much, this is interesting, I'll have a look into RtMidi
> > and your program. Given that the little I know is C, would it be much
> > more difficult to output notes using the ALSA API?
> >
> >
> >
> >> Since there is a driver for it, you should be able to use it as an
> >> alsa midi device.
> >>
> > uhm, i'm not sure... the driver (snd_usb_caiaq) actually takes care of
> > the soundcard built into the pedal, and along the way also makes the
> > buttons generate keyboard events, and the pedal EV_ABS events
> >
>
> Hi :)
>
> I'm not a Linux coder, nor a C/C++ coder, but the answer here is very
> simple: if the MIDI events (I guess this is what you mean by keyboard
> events) are recognized by Linux, then there seems to be a driver for
> your equipment that's good enough for your needs :). So go on, write a
> simple Linux MIDI tool in C/C++, make it open source and reach in
And English also isn't my language ;). Maybe 'hand in' a link is what I meant.
> the link. Maybe some Dino-Assembler-Beings for some microchips (like
> me) will be able to understand how to program for Linux too, if there
> were some very simple examples beyond "Hello world" ;).
>
> Good luck,
> Ralf
On Sat, 2009-07-04 at 19:12 -0400, Mark Vitek wrote:
> Hello,
>
> I am new to audio/MIDI programming, but in my experience JACK is the
> easier API to learn and use.
> If you download midimon from SourceForge and look at mm_jack.cc /
> mm_jack.h, you can see the basic usage.
> Basically, you pass a callback function to JACK, which allows you to
> pull available MIDI events from the JACK API.
> I just queued those up using a GLib queue and pull the events off in
> the user interface thread. Very straightforward.
> http://sourceforge.net/projects/midimon/
Thank you, I'll have a look at this.
What I can't work out is whether it is possible, with the JACK API, to
actually *create* MIDI note on/off messages. The examples I've seen
(midimon too, it seems to me) only handle (reroute etc.) MIDI
coming from other sources, like an external keyboard or another program.
In my case, instead, I'll have to actually create those messages, as
MIDI sequencers do, for example.
Maybe this is the reason why programs that generate MIDI (seq24, for
example) use ALSA? (See my confusion in point 1 of my first message.)
Renato
Thorsten Wilms wrote:
> On Fri, 2009-07-03 at 23:19 +0200, Ulrich Lorenz Schlüter wrote:
>
>
>> I proudly released version 0.1 of uli-plugins, a collection of LV2 plug
>> ins. ULI is the abbreviation for *U*lis-*L*v2-*I*nserts.
>>
>
> Hi!
>
> ./waf configure failed to find lv2-plugin. I had to comment/uncomment
> lines in wscript to make it look for lv2core instead ... so you seem to
> know about this problem in principle? ;)
>
> But now I get:
> ../gates/and2i.cpp:1:25: error: lv2plugin.hpp: No such file or directory
>
Hi Thorsten,
on my Gentoo system the lv2-plugin library comes with a package named
lv2-c++-tools. The missing header also belongs to that package.
Everything should be fine after installing it.
Uli
Robert Keller wrote:
>
> On Jun 11, 2009, at 5:19 AM, Grammostola Rosea wrote:
>
>> lasconic wrote:
>>> I took some time last night to take a look at the Impro-Visor code
>>> and estimate the cost of adding musicXML export. Import is indeed
>>> more complicated.
>>> I downloaded the code of Impro-Visor 3.39. It's the latest and only
>>> code available. Impro-Visor's inner model is a little different from
>>> the musicXML one. Common practice in musicXML is not to "time" the
>>> chords but to put them in between notes. At least, this is my
>>> experience with Finale's musicXML export features.
>>> I managed to make a quick and dirty prototype to export a simple
>>> melody (no tuplets) and chord root and bass (no extensions yet).
>>> Chords are in between notes, but lily+musicxml2ly should be able to
>>> deal with it. Unfortunately, 3.39 is an old version, and according
>>> to Bob Keller the code base has changed a lot, but it's not public
>>> yet. With some more voices, perhaps we can convince Bob Keller and
>>> his team to open up the repository to the public. After all,
>>> Impro-Visor is a fine piece of software which can benefit from open
>>> development, especially if time and resources are an issue.
>>>
>>> Lasconic
>>>
>>>
>> Thanks man. I'll forward this to Bob Keller too.
>> I think he mentioned in a message that he is willing to give
>> developers svn access to the recent code.
>>
>> Bob, could you comment on this?
>>
>> Kind regards,
>>
>> \r
>>
>
> I'll be looking toward moving Impro-Visor to a public repository, as
> soon as I stabilize the current version, which I hope will be before
> the end of June. Is SourceForge the best bet?
>
> Thanks.
>
> Bob
>
> Robert Keller
> Csilla & Walt Foley Professor
> Computer Science
> Harvey Mudd College
>
>
>
>
>
Hey Bob,
How are things going? New version almost ready? And is the source public
on SourceForge yet? Looking forward to it.
Kind regards,
Roos
Sorry for the cross-posting, but the statistics show that only a few people are subscribed to all three lists.
------------
Just one month after the previous release, the Denemo project has released a new version of its music notation program: Denemo 0.8.6, available for Windows, Linux and MacOS (via third-party builds) as source and binaries. The software is distributed under the GPL. Denemo's notation functionality is ready for daily and professional use. It aims to be the only tool you ever need for notation and an Open Source alternative to Finale, Sibelius and other non-free software, because the tools for producing art and culture should be free.
Notable new features are:
- Downloading new commands and edit scripts between releases.
- MIDI out, tempo and volume changes, and insertion of arbitrary MIDI messages at any point in the music.
- Editing lyrics in a text editor while seeing the syllable placement as you type. Multiple verses per voice are allowed.
- Pasting LilyPond text directly into the Denemo window. By pasting the actual music text, an editable Denemo score can be created from almost any LilyPond file.
- With JACK, playback starts from the cursor, or plays back the selection if there is one. All this happens without re-creating the MIDI data, and in any case without generating external files.
Official support, besides our website, is available via our IRC channel #denemo on irc.freenode.net.
For future improvements our team is looking for additional developers. If you are interested in notation and MIDI sequencing, please join the team!
Website: http://www.denemo.org
Additional information:
GNU Denemo is a free, GPL-licensed, open source music notation editor for Linux, MacOS and Windows that lets you rapidly enter notation for typesetting via the LilyPond music engraver. You can compose, transcribe, arrange, listen to the music and much more. Music can be typed in at the PC keyboard, played in via a MIDI controller, or input acoustically through a microphone plugged into your computer's soundcard.
Ken Restivo <ken(a)restivo.org> writes:
> Now all we need is a parametric EQ that allows one to click on the
> curves and put dots ... [snip] ...
It is planned, but I don't consider it that important, so it will
probably not happen (soon) unless somebody with PyGTK and Cairo skills
contributes.
--
Nedko Arnaudov <GnuPG KeyID: DE1716B0>
Heya,
Just a quick announcement:
I just moved into Fedora Rawhide a little daemon called "RealtimeKit",
which will be enabled by default. Since it is now a dependency of
PulseAudio, and things work how they work, it will not only be
available in Fedora 12 but sooner or later in the other distributions
as well, installed by default.
So what does this do? It's a simple policy daemon that hands out
SCHED_RR scheduling to normal user processes/threads that ask for it.
So what's so fancy about it? Nothing, except that it is hopefully a
good solution for handing out RT scheduling, and one that is actually
secure.
What's wrong with using RLIMIT_RTPRIO? The simple fact that we cannot
enable it by default, since it basically empowers the user to freeze
the machine. Also, asking the user to edit /etc/security/limits.conf
is certainly not user-friendly. We want to enable RT scheduling for
media applications out-of-the-box.
But what's wrong with relying on RLIMIT_RTTIME? Being a per-process
limit, it can very easily be circumvented to freeze the machine by
combining an RT busy loop with a fork bomb.
But what's wrong with relying on a canary watchdog to avoid freezing
systems? It's racy: an evildoer could fork more quickly than the
canary watchdog daemon can demote its children. So a canary alone is
not really a protection against a frozen system.
Why not use cgroups for this? Because it's simply a horrible API, and
using it for media applications has non-obvious consequences for
using cgroups for their originally intended purpose, which is
containers.
So what does RealtimeKit do that previous solutions didn't? rtkit
relies on a new kernel feature, SCHED_RESET_ON_FORK, which recently
got merged into Ingo's tree and will hence shortly appear in 2.6.31.
You can set that flag when entering SCHED_RR scheduling, and it then
makes sure that after forking, a child is reset to SCHED_OTHER. RT
fork bombs can thus be made impossible: if we hand out RT to a
process, we can be sure it won't "leak", and if we decide to take it
away again, we can be sure we can do that without having to be afraid
of races around forking.
rtkit enforces limits on the number of threads/processes/users that
get RT scheduling. It also does rate limiting and calls into
PolicyKit before handing out RT. Finally, as an extra a-posteriori
protection, it also includes a canary watchdog.
So what does that mean for you?
If you don't do RT development, or do RT development only for
embedded cases, or if you are a Gentoo-Build-It-All-Myself-Because-It-Is-So-Much-Faster-And-Need-To-Reinvent-The-Wheel-Daily-And-Configurating-Things-Is-Awesome-Guy,
then it doesn't mean anything for you.
However, if you are a desktop developer interested in getting your
stuff working out-of-the-box on modern distributions, then you should
think about calling into RealtimeKit to acquire RT scheduling.
RealtimeKit has a trivial API: to make a thread SCHED_RR, there is
just one D-Bus method you need to call. You can either code that call
yourself, or alternatively just copy the reference client
implementation rtkit includes into your sources:
http://git.0pointer.de/?p=rtkit.git;a=blob;f=rtkit.h
http://git.0pointer.de/?p=rtkit.git;a=blob;f=rtkit.c
For more information see this:
http://git.0pointer.de/?p=rtkit.git;a=blob;f=README
So yep, it would be great if folks would adopt this in their apps, so
that users don't need to know about all those Unix intricacies such
as resource limits and so on, but still get good performance in their
media applications by default.
This is now in Fedora Rawhide, which will still take a few months to
be released as F12. The other distros will probably need a bit more
time. So this is not a burning issue yet, and is mostly intended as a
heads-up right now. Unless of course you are one of those cool dudes
living on the bleeding edge.
Packagers, you might want to steal this .spec file for your work:
http://cvs.fedoraproject.org/viewvc/devel/rtkit/rtkit.spec?revision=1.1&vie…
Questions?
Lennart
--
Lennart Poettering Red Hat, Inc.
lennart [at] poettering [dot] net
http://0pointer.net/lennart/ GnuPG 0x1A015CC4
Hi,
I've been working on a modular synth builder for a while and I've just
put the first release up on SourceForge:
http://sourceforge.net/project/showfiles.php?group_id=262459&package_id=322…
(It requires slv2, GooCanvas 0.14, and the ladspa-swh-plugins if you
want to build it.)
There are also some screenshots here (the second one shows the created
synth in use):
https://sourceforge.net/project/screenshots.php?group_id=262459
This is all pre-alpha stuff so don't expect it to work that well yet (or
even compile easily!). It can only create fairly basic synths at
present.
I could do with some help checking over the basic architecture and
prioritising future work, so join the mailing list on sourceforge if you
want to get involved.
Damon
> > Windows (i.e. "%ProgramFiles%\LADSPA Plugins\rdf"). Not that I
> > generally build with LRDF support on Windows anyway.
>
> This makes sense to me, if only it could be made "official"...
> (windows paths too)
Windows has official rules for this. Users are no longer allowed to add
random files to an application's directory in "/Program Files/Appname":
permissions are set to prevent it, and the directory is virtualized to
prevent it.
The exception: applications can install their own bundled plugins in
"/Program Files/Appname" at install time only (because the installer has
elevated privileges).
User-installed plugins go in CSIDL_PROGRAM_FILES_COMMON, i.e. typically
"C:\Program Files\Common Files\LADSPA Plugins...".
Best Regards,
Jeff McClintock
guitarix is a simple Linux rock guitar amplifier, designed to achieve nice thrash/metal/rock/blues guitar sounds.
guitarix uses the JACK Audio Connection Kit as its audio backend and brings one input and two output ports into the JACK graph.
Release 0.04.6-1 comes with some major changes:
* Build environment and source code changes:
- use of the Python-based waf build system.
- use of the Boost library for command line options.
- various code cleanups and source tree restructuring.
All this has been done by our new project member James Warden.
* Audio effect and modeling:
- new tube model
- fuzz
Please read the README for more details regarding the new
build process and the command line options.
have fun
________________________________________________________________________
The standalone version of guitarix is based on GTK+ 2. But guitarix is also released as a suite of LADSPA plugins and can be used in e.g. Ardour.
guitarix is licensed under the GPL.
Project page with screenshots:
http://guitarix.sourceforge.net/
download:
http://sourceforge.net/projects/guitarix/
For capture, guitarix uses the external application 'jack_capture' (version >= 0.9.30) written by Kjetil
S. Matheussen. If you don't have it installed, you can look here:
http://old.notam02.no/arkiv/src/?M=D
For extra Impulse Responses, guitarix uses the convolution application 'jconv' created by Fons Adriaensen. If
you don't have it installed, you can look here:
http://www.kokkinizita.net/linuxaudio/index.html
I (Hermann) used Faust to build the prototype and would like to say
thanks to:
: Julius Smith
http://ccrma.stanford.edu/realsimple/faust/
: Albert Graef
http://www.musikwissenschaft.uni-mainz.de/~ag/ag.html
: Yann Orlarey
http://faust.grame.fr/
regards, Hermann & James