Hi,
QMidiArp 0.5.2 has just seen the light of day. It brings mainly
two improvements. The first is a comeback: tempo changes on the fly,
which now also follow tempo changes of a potential Jack Transport
master. The Jack Transport starting position is also finally taken
into account, so QMidiArp should stay in sync even when the transport
master does not start at zero.
The second is Non Session Manager support, mainly thanks to the work done by Roy Vegard Ovesen!
Note that compiling with NSM support now requires liblo as a dependency.
Enjoy, and enjoy LAC in Graz this year
Frank
________________________________
QMidiArp is an advanced MIDI arpeggiator, programmable step sequencer and LFO.
Everything is on
http://qmidiarp.sourceforge.net
qmidiarp-0.5.2 (2013-05-09)
New Features
o Tempo changes are again possible while running, either manually or
driven by a Jack Transport master
o The Jack Transport position is now taken into account when starting;
QMidiArp previously always started at zero
o Muting and sequencer parameter changes can be deferred to pattern
end using a new toolbutton
o Modules in the Global Storage window have mute/defer buttons
o Global Storage location switches can be set to affect only the pattern
o Non Session Manager support with "switch" capability (thanks to
Roy Vegard Ovesen)
General Changes
o NSM support requires liblo development headers (liblo-dev package)
Anyone interested in beta-testing this, please let me know.
Zita-njbridge
-------------
Command line Jack clients to transmit full quality
multichannel audio over a local IP network, with
adaptive resampling at the receiver.
Main features:
* One-to-one (UDP) or one-to-many (multicast).
* Sender and receiver(s) can each have their own
sample rate and period size.
* Up to 64 channels, 16 or 24 bit or float samples.
* Receiver(s) can select any combination of channels.
* Low latency, optional additional buffering.
* High quality jitter-free resampling.
* Graceful handling of xruns, skipped cycles, lost
packets and freewheeling.
* IP6 fully supported.
* Requires zita-resampler, no other dependencies.
Note that this version is meant for use on a *local*
network. It may or may not work on the wider internet if
receiver(s) are configured for additional buffering,
and if you are lucky. The current code replaces
any gaps in the audio stream with silence, and does not
attempt to re-insert packets that arrive out of order.
You will need a fairly recent Jack version, as the
code uses jack_get_cycle_times() and no fallback for
that is provided.
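As a rough illustration of the gap handling described above (this is a hypothetical sketch, not the actual zita-njbridge code), a receiver that conceals missing packets with silence based on sequence numbers might look like:

```python
# Hypothetical sketch of gap concealment: missing sequence numbers
# become silence, and late (out-of-order) packets are dropped,
# matching the behaviour described in the announcement.

FRAMES_PER_PACKET = 64  # example packet size
CHANNELS = 2

def conceal_gaps(packets):
    """packets: list of (seq, samples) tuples, possibly with gaps.
    Returns a contiguous sample stream with silence filling the holes."""
    out = []
    expected = packets[0][0]
    for seq, samples in packets:
        if seq < expected:
            continue  # arrived out of order: drop, don't re-insert
        while expected < seq:
            # Fill the gap left by a lost packet with silence.
            out.extend([0.0] * FRAMES_PER_PACKET * CHANNELS)
            expected += 1
        out.extend(samples)
        expected += 1
    return out
```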
Ciao,
--
FA
A world of exhaustive, reliable metadata would be an utopia.
It's also a pipe-dream, founded on self-delusion, nerd hubris
and hysterically inflated market opportunities. (Cory Doctorow)
As referred to earlier, I am working on an app to use a standard computer
keyboard for DAW (or other) control. I have started with actkbd and am
moving on from there. actkbd remains fully usable, but I have added things
to make it useful for controlling DAW apps. My first attempt at a jackd
interface was to allow keys to control the jack transport. The actions I
set up are:
roll
stop
zero
forward 1 sec (48 kframes)
forward 10 sec (480 kframes)
back 1 sec
back 10 sec
Because of how actkbd is set up, the keys used for these are fully
configurable. I have been using the numeric keypad:
Enter = roll
0 = stop
+ = forward
+ repeat = fast forward
etc.
The repeat can be used or ignored. In the future I would like to set up
the forward and back so the user can configure the number of frames rather
than having just two choices, but I am more interested in proving the
concept first.
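The key-to-action dispatch described above can be sketched like this (key names and the dispatch mechanism are hypothetical — actkbd drives the real thing from its own config file; 48000 frames/s assumed):

```python
# Sketch of the transport actions listed above, as pure frame arithmetic.
# The actual program would send these positions to the jack transport.

SAMPLE_RATE = 48000  # frames per second

def apply_action(action, frame):
    """Return the new transport frame position for a given action."""
    if action == "zero":
        return 0
    if action == "forward_1s":
        return frame + SAMPLE_RATE
    if action == "forward_10s":
        return frame + 10 * SAMPLE_RATE
    if action == "back_1s":
        return max(0, frame - SAMPLE_RATE)   # clamp at transport start
    if action == "back_10s":
        return max(0, frame - 10 * SAMPLE_RATE)
    return frame  # roll/stop do not move the position
```

Making the 1- and 10-second deltas user-configurable would just mean replacing the two constants with values read from the config file.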
My next thing is to set up (jack)MIDI out (the port shows on the graph so
far :)
However, while testing this (with Ardour, as it happens), I am wondering
about the merits of using the jack transport at all. That is, would it be
better to use midi only to control one application's control of the
transport rather than controlling the transport directly? In this case the
user would have the option of which to use, because they would not have to
configure any keys to send transport actions. But I am wondering if it
would cause problems that could easily be solved by not offering transport
control at all.
Lots of things still to figure out:
- send unused keys to another system kb interface so X will grab them
(allows splitting the keyboard)
- output MIDI info on both jack and alsa.
- gracefully not use jack outputs if jack is not running
- detect jack showing up and start using jack outputs
- create some sample, but useful config files for those who don't want to
- create a GUI to make config files
So far, I have been using a second keyboard which is "grabbed" so that X
doesn't see it.
--
Len Ovens
www.ovenwerks.net
Sorry about the posting on LAA; the reply should have been discarded but
instead I ticked accept :(
Anyhow, I'm hereby forwarding it to LAD because I guess it fits better here.
Bye,
Jeremy
-------- Original Message --------
Subject: Re: [LAA] jackwsmeter 1
Date: Mon, 14 Jul 2014 11:13:56 +0200
From: Guillaume Pellerin <lists(a)parisson.com>
To: linux-audio-announce(a)lists.linuxaudio.org
Hi Fred,
Great project!
I get this:
momo@wm22:~/dev/audio/jackwsmeter$ make
CC jackwsmeter-jackwsmeter.o
jackwsmeter.c: In function ‘callback_http’:
jackwsmeter.c:92:3: error: too few arguments to function
‘libwebsockets_serve_http_file’
In file included from jackwsmeter.c:41:0:
/usr/local/include/libwebsockets.h:1077:1: note: declared here
jackwsmeter.c:125:7: error: ‘LWS_CALLBACK_SET_MODE_POLL_FD’ undeclared
(first use in this function)
jackwsmeter.c:125:7: note: each undeclared identifier is reported only
once for each function it appears in
jackwsmeter.c:129:7: error: ‘LWS_CALLBACK_CLEAR_MODE_POLL_FD’ undeclared
(first use in this function)
make: *** [jackwsmeter-jackwsmeter.o] Erreur 1
Cheers,
Guillaume
Hi,
I would like to announce some tools around OSC.
oschema: a format definition to describe OSC units
https://github.com/7890/oschema
oscdoc: create HTML documentation from oschema instances
https://github.com/7890/oscdoc
txl: a simplified text format that can be translated to XML (and vice versa)
https://github.com/7890/txl
Basic idea:
-Having a standardized, machine-readable format to describe an OSC API
-Deriving "stuff" from the description, like documentation, code skeletons etc.
-Letting programs use an OSC API dynamically by looking at its definition
Proposed workflow:
-Write OSC API definition using txl (optional)
-Convert txl to XML (optional)
cat my.txl | txl2xml > my.xml
-Use post-processing chain for desired output (e.g. oscdoc)
oscdoc my.xml /tmp/mydoc
If I've got your attention, please get a quick overview before cloning:
http://lowres.ch/oschema/oschema.html (oschema documentation)
http://lowres.ch/oschema/oschema.svg (interactive structure)
http://lowres.ch/oscdoc/unit.txl (an example txl file describing an OSC unit)
http://lowres.ch/oscdoc/unit.xml (corresponding XML file, oschema instance
document)
http://lowres.ch/oscdoc/index.html (output of oscdoc)
Please let me know if you find anything unclear or missing.
Have a nice day
Tom
Just looking through the midi messages used by the Mackie and other
controllers. It appears to me that the protocol is designed to be used on
its own physical midi channel, as it uses (up to) all the logical midi
channels (1 to 16) and so could not work running through a midi kb, for
example. It is just the faders that do this, using the pitch control on
each channel; all the switches, lamp signals and encoders are on channel 1
(or 0 if you like), which would also conflict with some of the older KBs
like the DX7, which use only channel 1. Now obviously, if the controller is
a USB device, it creates its own midi port anyway. I took a look at Ardour
(because that is what I use to record) and it has a port dedicated to the
controller (and other things), so that is not a problem.
My question then is: do all applications that can be controlled by midi
have a dedicated port for control? (Well, most anyway. The Non group of
applications accepts midi control of the transport, but not the mixer,
which wants midi converted to CV first... making it effectively not
controllable via a control surface without some sort of SW interface to
take care of banks etc.) Hmm, directly controlling the jack transport
might be an idea.
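For illustration, decoding a Mackie-style fader message — pitch bend on a per-strip MIDI channel, as described above — could look like this (a sketch, not taken from any particular implementation):

```python
def decode_fader(msg):
    """Decode a 3-byte MIDI pitch-bend message into (fader_number, value).

    Mackie-style controllers send each fader as pitch bend on its own
    MIDI channel: status 0xE0 | channel, then LSB and MSB (7 bits each).
    Returns None for non-pitch-bend messages (switches, lamps, encoders
    arrive as other message types on channel 1).
    """
    status, lsb, msb = msg
    if status & 0xF0 != 0xE0:
        return None
    fader = status & 0x0F        # the MIDI channel selects the fader strip
    value = (msb << 7) | lsb     # 14-bit fader position, 0..16383
    return fader, value
```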
--
Len Ovens
www.ovenwerks.net
Hello, I'm new to this list, but I realized I was asking a lot of
development questions on LAU that would probably be better here.
I am writing a program that takes some system input from the keyboard and
redirects it as midi. This is to allow:
- a second keyboard to be used as a midi controller
- switching the system keyboard back and forth between normal and midi
controller
- using a portion of the system keyboard as a midi controller (the numeric
pad for example)
The advantage of this over keyboard short cuts is that this will work the
same no matter what the desktop focus may be.
I am (as many people do) writing this for my own use, but will make it
available for others who may have similar needs.
My question:
Where and how is the best place to put the MIDI port?
My first thought was to have it create a jack midi port, as this would be
the most direct. The problems I see with this are: 1) my program will run
as a different user than jackd, and 2) my program may run before and
perhaps after jackd.
So then I thought I'd have to use ALSA for my midi port. I need to know:
if I do this, will a jackd that is running a different backend than alsa
(firewire for example) still see the alsa midi ports? I am guessing that
a2jmidid would still work in any case, but not everyone uses that.
Assuming I have to use alsa midi (unless someone can suggest how to make
this work with jack midi), I have noticed that some applications that use
alsa midi do not leave a visible port in alsa and do all connections
internally, which would be useless in this case. There are some examples of
code at http://www.tldp.org/HOWTO/MIDI-HOWTO-9.html and I was wondering if
this method leaves visible midi ports. I also notice it is from 2004 or so;
is the info still valid, or is there a better place to look?
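Whichever port type ends up being used, the core translation step is small. A sketch of turning a key event into raw midi bytes (the keycode-to-note map, channel and velocity here are hypothetical choices; delivering the bytes to an ALSA or jack midi port is the separate, open question):

```python
# Sketch: translate a grabbed keyboard event into a 3-byte midi message.
# Unmapped keys return None, so they could be passed back to X instead
# (the "split keyboard" idea).

NOTE_MAP = {79: 60, 80: 62, 81: 64}  # example: numeric-pad keycodes -> notes
CHANNEL = 0
VELOCITY = 100

def key_event_to_midi(keycode, pressed):
    """Return a 3-byte midi note message for a key event, or None."""
    note = NOTE_MAP.get(keycode)
    if note is None:
        return None
    status = (0x90 if pressed else 0x80) | CHANNEL  # note-on / note-off
    return bytes([status, note, VELOCITY if pressed else 0])
```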
The keyboard input and grabbing will be based on the code from actkbd, as
it seems to already do all that stuff and is GPL2 (and besides, the code
looks understandable to me).
--
Len Ovens
www.ovenwerks.net
The Guitarix developers proudly present
Guitarix release 0.30.0
For the uninitiated, Guitarix is a tube amplifier simulation for
jack (Linux), with an additional mono and a stereo effect rack.
Guitarix includes a large list of plugins, supports LADSPA plugins, and
now, as well, LV2 plugins.
The guitarix engine is designed for LIVE usage and features ultra-fast,
glitch- and click-free preset switching, fully Midi and/or remote
controllable (the Web UI is not included in the distributed tarball).
Presets can be organized in banks, so a bank could represent your "show".
If you have any issues with preset loading/creating/handling, please read
in our wiki how this works in guitarix. It is straightforward, and maybe
more intuitive than you would guess.
http://sourceforge.net/p/guitarix/wiki/EnhancedUI/#creating-presets
This release fixes some bugs in our faust-based zita-rev1 implementation,
adds LV2 support to the guitarix racks and introduces some new plugins,
both GX-based and in the LV2 format.
New plugins:
* GxDetune (gx / LV2)
* Baxandall tonestack (gx)
* GxShimmizita (LV2)
* GxSwitchedTremolo (LV2)
Please refer to our project page for more information:
http://guitarix.sourceforge.net/
Download Site:
http://sourceforge.net/projects/guitarix/
Forum:
http://guitarix.sourceforge.net/forum/
Please consider visiting our forum or leaving a message on
guitarix-developer(a)lists.sourceforge.net
<mailto:guitarix-developer@lists.sourceforge.net>
So it's summer, they say.
White bright and light pastel colors sparkling on every corner and turn.
Cheesy and silly season, they say. Alas, southerners don't apply. Sorry
about that. Of course I mean the hemisphere, obviously.
For whom it might concern, all anxiety has come to an end.
Indeed.
It all relates back to this last May 3, when a not-so-formal meeting
(aka. workshop) took place during LAC2014@ZKM-Karlsruhe, where
some pertinent and undeniable requests were dodged and framed into a
"soonish" implementation. And guess what?
Yup, the "soonish" are no more, or so I think.
Qtractor 0.6.2 (boson walk beta) is out!
Perhaps an additional word is due though, about the riddling code-names
that are branding the post-TYOQA beta releases. They have no personal
nor logical sense, I assure you. Perfectly arbitrary now. Everything in
life and the universe is way more unconventional than just a name.
Without further assay.
Qtractor is an audio/MIDI multi-track sequencer application written in
C++ with the Qt4 framework. Target platform is Linux, where the Jack
Audio Connection Kit (JACK) for audio and the Advanced Linux Sound
Architecture (ALSA) for MIDI are the main infrastructures to evolve as a
fairly-featured Linux desktop audio workstation GUI, specially dedicated
to the personal home-studio.
Release highlights:
* Plugins activation MIDI controller / automation (NEW)
* LV2 UI Idle and Show (>= Qt5) interface support (NEW)
* Discrete editing of automation curve node values (NEW)
* Missing audio/MIDI files and plugins warning message (NEW)
* MIDI note drawing on tempo-map changes (FIX)
* Automation curves re-adjusted to tempo-map changes (FIX)
Website:
http://qtractor.sourceforge.net
Project page:
http://sourceforge.net/projects/qtractor
Downloads:
http://sourceforge.net/projects/qtractor/files
- source tarball:
http://download.sourceforge.net/qtractor/qtractor-0.6.2.tar.gz
- source package (openSUSE 13.1):
http://download.sourceforge.net/qtractor/qtractor-0.6.2-12.rncbc.suse131.sr…
- binary packages (openSUSE 13.1):
http://download.sourceforge.net/qtractor/qtractor-0.6.2-12.rncbc.suse131.i5…http://download.sourceforge.net/qtractor/qtractor-0.6.2-12.rncbc.suse131.x8…
- quick start guide & user manual (severely outdated, see wiki):
http://download.sourceforge.net/qtractor/qtractor-0.5.x-user-manual.pdf
- wiki (help wanted!):
http://sourceforge.net/p/qtractor/wiki/
Weblog (upstream support):
http://www.rncbc.org
License:
Qtractor is free, open-source software, distributed under the terms
of the GNU General Public License (GPL) version 2 or later.
Change-log:
- Prevent linear and spline automation curve modes for all
integer-valued subjects. Also, make sure those values are rounded to the
nearest integer away from zero.
- Fixed save of LV2 Presets for plugins with state files.
- A man page has been added (building on Gürkan Sengün's work on Debian,
thanks).
- When moving plugins, e.g. by drag-and-dropping across tracks, automation
curves were being left behind, possibly leading to unpredictable
behavior. Hopefully, not anymore.
- Translations install directory change.
- Automation curves are now automatically re-adjusted to tempo map node
changes (after a ticket by Holger Marzen, thanks).
- Audio/MIDI files or plugins found missing on session load are now
subject for an explicit modal warning message and prompt for an
immediate session backup salvage.
- Changing instrument plugin programs is now an undo/redo-able command
operation, especially for DSSI but also for plugins that come with the
LV2 Programs interface extension support
(http://kxstudio.sourceforge.net/ns/lv2ext/programs).
- Drawing, selecting and/or resizing of MIDI note events that extend
across tempo/time-signature changes is now handled a bit more correctly
in the MIDI clip editor (aka. piano-roll), especially regarding the
current snap-to-beat setting (after an outstanding ticket by yubatake,
thanks).
- Once again, audio frame/MIDI time drift correction has been slightly
refactored to improve MIDI input monitor and timing.
- Discrete automation curve node values may now be edited via a
numerical entry floating spin-box on double-click (as yet another
request by AutoStatic aka. Jeremy Jongepier, thanks).
- Pressing shift/ctrl keyboard modifiers while double-clicking on a
plugin list entry now briefly reverses the current
View/Options.../Plugins/Editor/Open plugin's editor (GUI) by default
option preference.
- Fixed an old crash lurker when switching output buses that implied a
change on the number of audio channels, while on tracks that have
(auto-)monitor turned on and at least one active plugin in chain (yet
another ticket by AutoStatic aka. Jeremy Jongepier, thanks).
- MIDI Controller assignment (aka MIDI learn) and/or automation of
plugins (de)activation state has been added (as requested by AutoStatic
aka. Jeremy Jongepier, thanks).
- LV2 UI Idle and Show interfaces support added.
- Allow the build system to include user-specified LDFLAGS (patch by
Alessio Treglia aka. quadrispro, thanks).
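The "nearest integer away from zero" rounding mentioned in the first change-log entry can be expressed compactly (a sketch in Python for illustration; Qtractor itself is C++):

```python
import math

def round_away_from_zero(x):
    """Round to the nearest integer, with exact halves going away from zero.

    This differs from Python's built-in round(), which rounds halves to
    the nearest even integer (so round(2.5) == 2, not 3).
    """
    if x >= 0:
        return int(math.floor(x + 0.5))
    return int(math.ceil(x - 0.5))
```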
See also:
http://www.rncbc.org/drupal/node/795
Enjoy && have (lots of) fun.
--
rncbc aka Rui Nuno Capela
rncbc(a)rncbc.org
On Wed, July 2, 2014 2:22 pm, Flávio Schiavoni wrote:
> Hi Patrick
>
> A first advice is to try some tool that works with multicast / broadcast
> addressing method to allow a one to many connection. It means to work
> with UDP because TCP can not do multicast or broadcast. So you can save
> some bandwidth. Since RTP is not a transport protocol but a kind of
> application protocol over UDP, a tool RTP based can be used. If I'm not
> wrong, Icecast works with TCP. I dunno if it can be configured.
>
Thanks for that tip. I am currently looking at ffserver with ffmpeg. IIUC
it can support RTP too, so that might be a good way forward. I have it
running on my device and I am testing the stream/codec combinations at the
moment. Gotta hand it to the ffmpeg devs for keeping pace with the
market.
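For reference, the one-to-many multicast/UDP setup Flávio describes boils down to something like this on the sender side (group address and port here are hypothetical examples):

```python
# Sketch of a one-to-many sender over UDP multicast.
# Each receiver joins the same group; the sender transmits once.
import socket

GROUP, PORT = "239.255.0.1", 5004  # example multicast group/port

def make_multicast_sender(ttl=1):
    """Create a UDP socket configured for multicast sending."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # TTL 1 keeps packets on the local network segment.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, ttl)
    return sock

# Usage (one sender, many receivers):
#   sock = make_multicast_sender()
#   sock.sendto(media_chunk, (GROUP, PORT))
```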
> Some questions:
> - Do you need to sync audio / video / MIDI?
Not really sample accurate but 2000ms is the limit for lag.
> - What is your audio / video / MIDI source? File? Cam?
/dev/graphics/fb0 + external BT microphone
> - How will it be used on the receiver? Monitor? Projector?
If I use ffserver the output will be displayed as a video stream at the
application level.
>
> Pure Data can send audio / midi / video.
>
I will look into PD if ffserver is unable to get the job done.
> If I'm not wrong, GStreamer can do it too.
>
> Cheers
>
> Schiavoni
>
> Em 01-07-2014 06:34, Patrick Shirkey escreveu:
>> Hi,
>>
>> Does anyone have a suggestion for open source solutions to enable
>> streaming AV/midi to multiple ARM mobile devices with a one to many
>> network configuration?
>>
>> I am looking at the following options:
>>
>> 1: ffmpeg streaming server
>> 2: icecast with netjack
>> 3: netjack
>>
>> There are some limitations.
>>
>> 1: Server is a mobile device with dual core ARM chipset
>> 2: Wifi connectivity with 20Mb/s total uplink from master server.
>>
>> An ideal implementation would allow 20 client devices to receive the
>> audio
>> stream in close to realtime. Upto 100ms delay would be acceptable.
>>
>> I'm weighing up the benefits from using FFMPEG to stream all the data
>> compared to a 32/64bit icecast stream with additional midi triggering
>> for
>> visual data located on the client app.
>>
>> - FFMPEG has the benefit of removing all trigger events but costs a lot
>> in
>> terms of bandwidth/power consumption.
>>
>> - Icecast is very good at serving audio but iiuc does not support
>> video/midi
>>
>> - Netjack can potentially do all three but is not well tested on a
>> mobile
>> platform.
>>
>>
>>
>>
>> --
>> Patrick Shirkey
>> Boost Hardware Ltd
>> _______________________________________________
>> Linux-audio-dev mailing list
>> Linux-audio-dev(a)lists.linuxaudio.org
>> http://lists.linuxaudio.org/listinfo/linux-audio-dev
>
--
Patrick Shirkey
Boost Hardware Ltd