Hi,
I would like to announce some tools around OSC.
oschema: a format definition to describe OSC units
https://github.com/7890/oschema
oscdoc: create HTML documentation from oschema instances
https://github.com/7890/oscdoc
txl: a simplified text format that can be translated to XML (and vice versa)
https://github.com/7890/txl
Basic idea:
- Have a standardized, machine-readable format to describe an OSC API
- Derive "stuff" from the description, such as documentation, code
skeletons, etc.
- Let programs use an OSC API dynamically by inspecting its definition
Proposed workflow:
- Write the OSC API definition using txl (optional)
- Convert txl to XML (optional)
cat my.txl | txl2xml > my.xml
- Use a post-processing chain for the desired output (e.g. oscdoc)
oscdoc my.xml /tmp/mydoc
If I've got your attention, please get a quick overview before cloning:
http://lowres.ch/oschema/oschema.html (oschema documentation)
http://lowres.ch/oschema/oschema.svg (interactive structure)
http://lowres.ch/oscdoc/unit.txl (an example txl file describing an OSC unit)
http://lowres.ch/oscdoc/unit.xml (corresponding XML file, oschema instance
document)
http://lowres.ch/oscdoc/index.html (output of oscdoc)
Please let me know if you find anything unclear or missing.
Have a nice day,
Tom
I've just been looking through the MIDI messages used by the Mackie and
other controllers. It appears to me that it is designed to be used on
its own physical MIDI channel, as it uses (up to) all the logical MIDI
channels (1 to 16) and so could not work running through a MIDI
keyboard, for example. It is just the faders that do this, using the
pitch control on each channel; all the switches, lamp signals and
encoders are on channel 1 (or 0 if you like), which would also conflict
with some older keyboards like the DX7, which use only channel 1. Now
obviously, if the controller is a USB device, it creates its own MIDI
port anyway. I took a look at Ardour (because that is what I use to
record) and it has a port dedicated to the controller (and other
things), so that is not a problem.
My question then is: do all applications that can be controlled by MIDI
have a dedicated port for control? (Well, most anyway. The Non group of
applications accepts MIDI control of the transport, but not the mixer,
which wants MIDI converted to CV first... making it effectively not
controllable via a control surface without some sort of software
interface to take care of banks etc.) Hmm, directly controlling the
JACK transport might be an idea.
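(To sketch that last idea: as far as I can tell, the plain JACK C API
would let a tiny client start and stop the transport directly. This is
minimal and untested, and the client name is just a placeholder.)

/* transport-ctl.c -- minimal, untested sketch: start or stop the JACK
 * transport from the command line.
 * Build: gcc transport-ctl.c -o transport-ctl -ljack */
#include <stdio.h>
#include <string.h>
#include <jack/jack.h>
#include <jack/transport.h>

int main(int argc, char **argv)
{
    /* "transport-ctl" is an arbitrary client name */
    jack_client_t *client = jack_client_open("transport-ctl",
                                             JackNullOption, NULL);
    if (client == NULL) {
        fprintf(stderr, "could not connect to jackd\n");
        return 1;
    }
    if (argc > 1 && strcmp(argv[1], "stop") == 0)
        jack_transport_stop(client);   /* halt the transport */
    else
        jack_transport_start(client);  /* roll the transport */
    jack_client_close(client);
    return 0;
}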
--
Len Ovens
www.ovenwerks.net
Hello, I'm new to this list, but I realized I was asking a lot of
development questions on LAU that would probably be better asked here.
I am writing a program that takes some system input from the keyboard
and redirects it as MIDI. This is to allow:
- a second keyboard to be used as a MIDI controller
- switching the system keyboard back and forth between normal use and
MIDI controller
- using a portion of the system keyboard as a MIDI controller (the
numeric pad, for example)
The advantage of this over keyboard shortcuts is that it will work the
same no matter where the desktop focus may be.
I am (as many people do) writing this for my own use, but will make it
available for others who may have similar needs.
My question:
Where is the best place to put the MIDI port, and how should I create
it?
My first thought was to have it create a JACK MIDI port, as this would
be the most direct. The problems I see with this are: 1) my program
will run as a different user than jackd, and 2) my program may run
before and perhaps after jackd.
So then I thought I guess I have to use ALSA for my MIDI port. I need
to know: if I do this, will a jackd that is running a different backend
than alsa (firewire, for example) still see the ALSA MIDI ports? I am
guessing that a2jmidid would still work in any case, but not everyone
uses that.
Assuming I have to use ALSA MIDI (unless someone can suggest how to
make this work with JACK MIDI), I have noticed that some applications
that use ALSA MIDI do not leave a visible port in ALSA and do all
connections internally, which would be useless in this case. There are
some examples of code at http://www.tldp.org/HOWTO/MIDI-HOWTO-9.html
and I was wondering if this method will leave visible MIDI ports. I
also notice it is from 2004 or so; is the info still valid, or is there
a better place to look?
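(To make the question concrete, here is a minimal, untested sketch of
what I think the ALSA sequencer calls would look like for a client
whose port stays visible to aconnect and friends; the client and port
names are just placeholders.)

/* kbd2midi-port.c -- minimal, untested sketch: create an ALSA
 * sequencer client with a readable, subscribable port, which should
 * show up in `aconnect -l`.
 * Build: gcc kbd2midi-port.c -o kbd2midi-port -lasound */
#include <stdio.h>
#include <alsa/asoundlib.h>

int main(void)
{
    snd_seq_t *seq;
    if (snd_seq_open(&seq, "default", SND_SEQ_OPEN_OUTPUT, 0) < 0) {
        fprintf(stderr, "could not open the ALSA sequencer\n");
        return 1;
    }
    snd_seq_set_client_name(seq, "kbd2midi");  /* placeholder name */
    /* CAP_READ | CAP_SUBS_READ makes the port visible and connectable
     * as a source by aconnect, qjackctl, a2jmidid, etc. */
    int port = snd_seq_create_simple_port(seq, "output",
            SND_SEQ_PORT_CAP_READ | SND_SEQ_PORT_CAP_SUBS_READ,
            SND_SEQ_PORT_TYPE_MIDI_GENERIC |
            SND_SEQ_PORT_TYPE_APPLICATION);
    if (port < 0) {
        fprintf(stderr, "could not create the port\n");
        return 1;
    }
    /* ... translate keyboard events into snd_seq_event_t here ... */
    snd_seq_close(seq);
    return 0;
}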
The keyboard input and grabbing will be based on the code from actkbd,
as it seems to already do all that stuff and is GPL2 (and besides, the
code looks understandable to me).
--
Len Ovens
www.ovenwerks.net
The Guitarix developers proudly present
Guitarix release 0.30.0
For the uninitiated, Guitarix is a tube amplifier simulation for
JACK (Linux), with an additional mono and a stereo effect rack.
Guitarix includes a large list of plugins, supports LADSPA plugs and
now, new, LV2 plugs as well.
The guitarix engine is designed for LIVE usage and features ultra-fast,
glitch- and click-free preset switching; it is fully MIDI and/or remote
controllable (Web UI not included in the distributed tarball).
Presets can be organized in banks, so a bank can represent your "show".
If you have any issue with preset loading/creating/handling, please
read in our wiki how this stuff works in guitarix. It is
straightforward, and maybe more intuitive than you'd guess.
http://sourceforge.net/p/guitarix/wiki/EnhancedUI/#creating-presets
This release fixes some bugs in our FAUST-based zita-rev1
implementation, adds LV2 support to the guitarix racks and introduces
some new plugs, both GX-based and in the LV2 format.
new plugs:
* GxDetune (gx / LV2)
* Baxandall tonestack (gx)
* GxShimmizita (LV2)
* GxSwitchedTremolo (LV2)
Please refer to our project page for more information:
http://guitarix.sourceforge.net/
Download Site:
http://sourceforge.net/projects/guitarix/
Forum:
http://guitarix.sourceforge.net/forum/
Please consider visiting our forum or leaving a message on
guitarix-developer(a)lists.sourceforge.net
So it's summer, they say.
White bright and light pastel colors sparkling on every corner and turn.
Cheesy and silly season, they say. Alas, that doesn't apply to
southerners. Sorry about that. Of course I mean the hemisphere,
obviously.
To whom it may concern, all anxiety has come to an end.
Indeed.
It all relates back to this last May 3, when a not-so-formal meeting
(aka workshop) took place during LAC2014@ZKM-Karlsruhe, where some
pertinent and undeniable requests were dodged and framed for a
"soonish" implementation. And guess what?
Yup, the "soonish" is no more, or so I think.
Qtractor 0.6.2 (boson walk beta) is out!
Perhaps an additional word is due, though, about the riddling
code-names that are branding the post-TYOQA beta releases. They have no
personal or logical sense, I assure you. Perfectly arbitrary now.
Everything in life and the universe is way more unconventional than
just a name.
Without further ado.
Qtractor is an audio/MIDI multi-track sequencer application written in
C++ with the Qt4 framework. The target platform is Linux, where the
Jack Audio Connection Kit (JACK) for audio and the Advanced Linux Sound
Architecture (ALSA) for MIDI are the main infrastructures to evolve as
a fairly-featured Linux desktop audio workstation GUI, especially
dedicated to the personal home studio.
Release highlights:
* Plugin activation MIDI controller / automation (NEW)
* LV2 UI Idle and Show (>= Qt5) interface support (NEW)
* Discrete editing of automation curve node values (NEW)
* Missing audio/MIDI files and plugins warning message (NEW)
* MIDI note drawing on tempo-map changes (FIX)
* Automation curves re-adjusted to tempo-map changes (FIX)
Website:
http://qtractor.sourceforge.net
Project page:
http://sourceforge.net/projects/qtractor
Downloads:
http://sourceforge.net/projects/qtractor/files
- source tarball:
http://download.sourceforge.net/qtractor/qtractor-0.6.2.tar.gz
- source package (openSUSE 13.1):
http://download.sourceforge.net/qtractor/qtractor-0.6.2-12.rncbc.suse131.sr…
- binary packages (openSUSE 13.1):
http://download.sourceforge.net/qtractor/qtractor-0.6.2-12.rncbc.suse131.i5…
http://download.sourceforge.net/qtractor/qtractor-0.6.2-12.rncbc.suse131.x8…
- quick start guide & user manual (severely outdated, see wiki):
http://download.sourceforge.net/qtractor/qtractor-0.5.x-user-manual.pdf
- wiki (help wanted!):
http://sourceforge.net/p/qtractor/wiki/
Weblog (upstream support):
http://www.rncbc.org
License:
Qtractor is free, open-source software, distributed under the terms
of the GNU General Public License (GPL) version 2 or later.
Change-log:
- Prevent linear and spline automation curve modes for all
integer-valued subjects. Also, make sure those values are rounded to
the nearest integer away from zero.
- Fixed save of LV2 Presets for plugins with state files.
- A man page has been added (building on Gürkan Sengün's work on
Debian, thanks).
- When moving plugins, e.g. by drag-and-dropping across tracks,
automation curves were being left behind, possibly leading to
unpredictable, mistaken behavior. Hopefully, not anymore.
- Translations install directory change.
- Automation curves are now automatically re-adjusted to tempo map node
changes (after a ticket by Holger Marzen, thanks).
- Audio/MIDI files or plugins found missing on session load are now
subject to an explicit modal warning message and a prompt for an
immediate session backup salvage.
- Changing instrument plugin programs is now an undo/redo-able command
operation, especially for DSSI but also for plugins that come with the
LV2 Programs interface extension support
(http://kxstudio.sourceforge.net/ns/lv2ext/programs).
- Drawing, selecting and/or resizing of MIDI note events that extend
across tempo/time-signature changes is now handled a bit more correctly
in the MIDI clip editor (aka piano-roll), especially with regard to the
current snap-to-beat setting (after an outstanding ticket by yubatake,
thanks).
- Once again, audio frame/MIDI time drift correction has been slightly
refactored to improve MIDI input monitor and timing.
- Discrete automation curve node values may now be edited via a
numerical entry floating spin-box on double-click (as yet another
request by AutoStatic aka. Jeremy Jongepier, thanks).
- Pressing shift/ctrl keyboard modifiers while double-clicking on a
plugin list entry now briefly reverses the current
View/Options.../Plugins/Editor/Open plugin's editor (GUI) by default
option preference.
- Fixed an old crash lurker when switching output buses that implied a
change on the number of audio channels, while on tracks that have
(auto-)monitor turned on and at least one active plugin in chain (yet
another ticket by AutoStatic aka. Jeremy Jongepier, thanks).
- MIDI Controller assignment (aka MIDI learn) and/or automation of
plugins (de)activation state has been added (as requested by AutoStatic
aka. Jeremy Jongepier, thanks).
- LV2 UI Idle and Show interfaces support added.
- Allow the build system to include user-specified LDFLAGS (patch by
Alessio Treglia aka quadrispro, thanks).
See also:
http://www.rncbc.org/drupal/node/795
Enjoy && have (lots of) fun.
--
rncbc aka Rui Nuno Capela
rncbc(a)rncbc.org
On Wed, July 2, 2014 2:22 pm, Flávio Schiavoni wrote:
> Hi Patrick
>
> A first piece of advice is to try some tool that works with a
> multicast / broadcast addressing method to allow a one-to-many
> connection. That means working with UDP, because TCP cannot do
> multicast or broadcast; this way you can save some bandwidth. Since
> RTP is not a transport protocol but a kind of application protocol
> over UDP, an RTP-based tool can be used. If I'm not wrong, Icecast
> works with TCP. I don't know if it can be configured.
>
Thanks for that tip. I am currently looking at ffserver with ffmpeg.
IIUC it can support RTP too, so that might be a good way forward. I
have it running on my device and I am testing the stream/codec
combinations at the moment. Gotta hand it to the ffmpeg devs for
keeping pace with the market.
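(To spell out why multicast matters here: with 20 unicast clients my 20
Mb/s uplink leaves roughly 1 Mb/s per stream, whereas a multicast
packet leaves the server once no matter how many receivers join the
group. A minimal, untested sketch of the sending side is below; the
group address and port are arbitrary placeholders.)

/* mcast-send.c -- minimal, untested sketch: send one UDP datagram to a
 * multicast group so any number of joined receivers get the same
 * packet. Build: gcc mcast-send.c -o mcast-send */
#include <stdio.h>
#include <string.h>
#include <arpa/inet.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void)
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) { perror("socket"); return 1; }

    struct sockaddr_in group;
    memset(&group, 0, sizeof(group));
    group.sin_family = AF_INET;
    group.sin_addr.s_addr = inet_addr("239.0.0.1"); /* placeholder group */
    group.sin_port = htons(5004);                   /* placeholder port  */

    const char payload[] = "one packet, many receivers";
    if (sendto(sock, payload, sizeof(payload), 0,
               (struct sockaddr *)&group, sizeof(group)) < 0)
        perror("sendto");

    close(sock);
    return 0;
}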
> Some questions:
> - Do you need to sync audio / video / MIDI?
Not really sample-accurate, but 2000 ms is the limit for lag.
> - What is your audio / video / MIDI source? File? Cam?
/dev/graphics/fb0 + external BT microphone
> - How will it be used on the receiver? Monitor? Projector?
If I use ffserver the output will be displayed as a video stream at the
application level.
>
> Pure Data can send audio / midi / video.
>
I will look into PD if ffserver is unable to get the job done.
> If I'm not wrong, GStreamer can do it too.
>
> Cheers
>
> Schiavoni
>
> On 01-07-2014 06:34, Patrick Shirkey wrote:
>> Hi,
>>
>> Does anyone have a suggestion for open source solutions to enable
>> streaming AV/midi to multiple ARM mobile devices with a one to many
>> network configuration?
>>
>> I am looking at the following options:
>>
>> 1: ffmpeg streaming server
>> 2: icecast with netjack
>> 3: netjack
>>
>> There are some limitations.
>>
>> 1: Server is a mobile device with dual core ARM chipset
>> 2: Wifi connectivity with 20Mb/s total uplink from master server.
>>
>> An ideal implementation would allow 20 client devices to receive the
>> audio stream in close to realtime. Up to 100 ms delay would be
>> acceptable.
>>
>> I'm weighing up the benefits from using FFMPEG to stream all the data
>> compared to a 32/64bit icecast stream with additional midi triggering
>> for
>> visual data located on the client app.
>>
>> - FFMPEG has the benefit of removing all trigger events but costs a lot
>> in
>> terms of bandwidth/power consumption.
>>
>> - Icecast is very good at serving audio but iiuc does not support
>> video/midi
>>
>> - Netjack can potentially do all three but is not well tested on a
>> mobile
>> platform.
>>
>>
>>
>>
>> --
>> Patrick Shirkey
>> Boost Hardware Ltd
>
--
Patrick Shirkey
Boost Hardware Ltd
On Wed, July 2, 2014 5:16 am, drew Roberts wrote:
> On Tue, Jul 1, 2014 at 11:34 AM, Patrick Shirkey
> <pshirkey(a)boosthardware.com
>> wrote:
>
>>
>> On Tue, July 1, 2014 10:41 pm, drew Roberts wrote:
>> > On Tue, Jul 1, 2014 at 5:34 AM, Patrick Shirkey
>> > <pshirkey(a)boosthardware.com>
>> > wrote:
>> >
>> >> Hi,
>> >>
>> >> Does anyone have a suggestion for open source solutions to enable
>> >> streaming AV/midi to multiple ARM mobile devices with a one to many
>> >> network configuration?
>> >>
>> >>
>> >> - Icecast is very good at serving audio but iiuc does not support
>> >> video/midi
>> >>
>> >>
>> > IIRC, icecast2 can stream video. Never thought to try midi.
>> >
>>
>> According to the icecast folks, the latency and sync for a standard
>> stream can get out to 10 seconds, which is outside of my range. I
>> could probably handle up to 2000 ms, but less than 1000 ms is
>> preferable.
>>
>
> What are you thinking of using to do the "shouting"? IIRC, we were using
> vlc.
>
For this project I will probably have to build a custom tool that uses
ffmpeg for transcoding. VLC might be a good place to start, but the
codebase is pretty large if I have to customise it, so it's probably
faster to start from scratch.
> Concerning the sync, if you mean audio with video, what we were doing did
> not require synced audio.
>
This project probably doesn't require realtime, sample-accurate sync,
but the latency should be within 2000 ms between the audio and video
streams and also between master and client. Latency should be as low as
possible, with a balance between CPU load and bandwidth management.
Has anyone benchmarked realtime transcoding on dual-core ARM devices
with ffmpeg?
>>
>> Anyway I will give icecast with video a test run before I rule it out.
>>
>>
>> --
>> Patrick Shirkey
>> Boost Hardware Ltd
>> __________________
>>
>
> all the best,
>
> drew
> --
> http://freemusicpush.blogspot.com/
>
--
Patrick Shirkey
Boost Hardware Ltd
(Here is an unusual mail trying to describe, as entertainingly as
possible, what it's like to set up a modular production environment in
GNU/Linux. I owe the greatest respect to everyone who has made the
software mentioned below; please don't mistake a joke for blunt
criticism.)
Dear Lads,
You haven't heard much from me lately, and I'm sorry for that. For a
year and a half I've been sitting in front of my minimal - yet fairly
high-endish - music production setup doing nothing but programming.
Those expensive things were only used to play back CDs and digital
files...
... you know what that means:
I was becoming an audiophile.
Holy horse poop, I can't believe I said that... but it's true.
A few weeks ago, realizing this dreadful fact, I decided to start a
rehab: let's fire up the preamp and record some stuff. Just to prove to
the world that I'm not just another bragging gear slut (hopefully).
Of course, just to make things harder, I decided not to use the
full-featured, highly anticipated (and probably amazing) Ardour 3. Why
not? Because I've suffered 8 years of (GNU's Not) Unix propaganda and
now I praise "modularity" above anything else.
Mh, and I guess 27 years of Legos(tm) didn't help either (don't judge
me, Technic is a great prototyping tool... that's the best excuse I
have).
Lately, thanks to nedko (gladish), male (non-suite), drobilla
(patchage, ingen), falktx (Carla) (I always have to check the letter
order for that one nick :/) and a few others, my goal has become -
somehow - tangible.
Part one: THE SETUP
When you start a big music project with high expectations on GNU/Linux,
you need two things:
1) Make sure not to have a deadline
2) Make sure you are fully relaxed
Cause, you know how it is: these programs are all quite young and only
used by a few people. And "a few people" means very few bug reports.
And of course these programs are made by volunteers who can only fix
bugs when they have enough time. Considering all this, it's already
great that everything works!
Having a few bugs every once in a while in one program is no big
deal... But the thing with modular audio, by definition, is that you're
gonna use dozens of them. At once.
And bugs DO stack.
But I'm rushing a bit; we're not there yet, because at this point of
the story I haven't picked my software yet (except for the non-suite).
I don't have any effects nor MIDI anything: I need to gather some more
stuff to work with. And I must say I felt a bit like Indiana Jones
having to gather the pieces of a mysterious puzzle: "Your six strings
shall sound heavenly and your synths shall have many melodies, and you
will have to travel the Internet to find what you need."
Ok, the guitar part is an easy one: guitarix. It's just amazing. Let's
use it... and! Oh gosh! It's a trap! There's no way to recall a preset,
it's not patched with NSM... quick! Use the CLI, they MUST have an
"import" option! Oh crap, NO! They don't!
I thought my quest would be easy, but no. Instead of being able to pick
guitarix, I found a new quest: "use the LV2 version".
No big deal, let's just move to another place. Carla. It usually works
fine... but... gosh, all my plugins are here except the amp
simulator... what the heck?
Ingen then... erf... no, for some reason this plugin doesn't make a
sound. Damn. Back to Carla... with another quest:
"You shall not use the stable version. Use the git version." It's gonna
be harder than I thought.
Building is usually a piece of cake... but not this one time.
Eventually, in a last desperate attempt, I tried one more build, and
succeeded... until Qt5 - my trusted friend for years - suddenly decided
to betray me by placing booby traps all over the place to make sure I
wasn't able to add any effect. Luckily, the scientist in the team -
falktx - was able to defuse them.
Finally! Now I have a guitarix preset I can save and load with NSM.
Holy crap.
Everything's set for the guitar. It's time to move on to our glorious
quest for MIDI.
This quest was such a weird experience. I went to the vast land of
Rosegarden, where the GUI almost made me blind and mad. I tried the
purity of the non-sequencer until it exploded right in my face. I went
to an oddly useless one-bar non-chromatic sequencer...
Ok. It's gonna be Seq24 then. It's not perfect, but I can use it...
Let's just call the maintainer to fix the few problems... or not.
Nobody's actually taking care of this software.
Screw it. I'll use it anyway.
It's been a week already, and most of the musicians in my party have
given up for an easier world where you can buy peace.
I've been able to make some sounds, and that's what keeps me going... I
don't know how long this will take, but I'll gather all the pieces of
this modular music machine.
Eventually, with the help of the LAD scientists, it'll work properly.
Let's keep faith.
Love.
Tumulte.
P.S.: As tedious as it was, I'm still thrilled by the potential of a
well-integrated modular audio software suite. I must say I'm even more
excited when I see what's *already* possible today. Therefore I'd like
to make a call for a working group around this question, because it's
really (*really*) close to something great. I believe that a good
dialogue between users trying to set up real production environments
and coders can quickly bring out a kick-ass suite that takes Linux
audio to the next level.
P.P.S.: All the best, and thanks for reading.