Please excuse my "youthful" enthusiasm...
I am having some success with my USB computer keyboard as a MIDI control
surface.
All things are assignable to any key.
The key events press, release, and repeat can be set separately (or
ignored).
Key presses can control:
- Jack transport:
roll
stop
zero
1 or 10 sec forward or back (I will make this settable);
I have press do one second and repeat do ten.
- MIDI messages, all given in hex, which can include:
channel messages like key, ctl, pgm, etc.
sysex up to 20 bytes in total
(I am not sure what the limit is for JACK,
but any of the control surfaces I have
studied seem to send less.)
The reality is that there are lots of control surfaces out there and
software has to handle what they send.
I do have a question, though, about controlling Ardour by MIDI. The Mackie
interface documentation says it sends control messages in the form of a
number of ticks up or down. This means I should be able to have one key
send ticks up and another send ticks down, but this doesn't seem to work.
Also, it seems the master rec-enable is not toggle-able. Odd. I have tried
both making a map file and using the learn function. Is there more
documentation for this somewhere? It would not be impossible to toggle
things from inside the controller, or to take ticks up and down and convert
them to 127 or 1024 steps. This is not just for Ardour; I want to know
what is standard.
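Converting ticks up/down into absolute steps, as mentioned above, only needs
a small accumulator. A minimal sketch in Python, assuming a Mackie-style
encoding where bit 6 of the CC data byte gives the direction and the low
bits the tick count (this encoding is an assumption for illustration; check
the documentation for the actual device):

```python
def apply_ticks(value, data):
    """Fold one relative 'ticks' data byte into an absolute 0-127 value.

    Assumed encoding (Mackie-style, hypothetical): bit 6 of the data
    byte selects direction (set = down), the low 6 bits are the count.
    """
    ticks = data & 0x3F
    if data & 0x40:
        value -= ticks
    else:
        value += ticks
    return max(0, min(127, value))  # clamp to the MIDI data-byte range

fader = 64
fader = apply_ticks(fader, 0x03)  # three ticks up -> 67
fader = apply_ticks(fader, 0x41)  # one tick down -> 66
```

The same accumulator works for 1024 steps by changing the clamp limits.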
--
Len Ovens
www.ovenwerks.net
Hi,
I'm looking for a supplier who can provide a turnkey or white-label LOW
COST Linux-compatible portable audio player: an SoC embedded Linux device.
10k - 150k units. Some hardware customisations may be required.
Any company that is interested in quoting on this project please contact
me directly to discuss the details.
--
Patrick Shirkey
Boost Hardware Ltd
Hi!
Let me forward you an article that was just posted to the AES67 mailing
list. If you are considering implementing AES67 one day, this article gives
you a brief introduction to the clocks involved.
If you're already familiar with PTP (IEEE 1588), there's no use in
reading it, since AES67 of course uses PTP (like Ethernet/AVB).
For a real-world implementation, one can assume that PTP is taken care
of by linuxptp, one of the way too many competing PTP implementations.
To the best of my knowledge, linuxptp is really the way forward, all
other implementations are considered obsolete, at least on mainstream
Linux.
Without further ado, here's the article:
http://www.tvtechnology.com/audio/0098/for-aes-timing-is-everything/271322
Cheers
--
mail: adi(a)thur.de http://adi.thur.de PGP/GPG: key via keyserver
On Mon, Aug 04, 2014 at 12:18:31PM -0700, Russell Hanaghan wrote:
> I'm curious... This application excites me based on the following theoretical layout:
>
> Budget studio with say at least 1 strong central DAW in a control room. Other satellite
> rooms that can be linked with Gig Ethernet and smaller (cheaper) platforms in those rooms
> for maybe recording certain instruments (drums might be a bitch even at low latencies)
> and feeding those channels back to the master. Effectively replacing shielded audio cables
> (which run into real money) for a Cat5e or Cat6 cable and gig ports either side. Gig
> switches and cards have become cheap, especially compared to half decent shielded pair
> audio cable.
>
> Is this reasonable as it relates to the application?
I'm not entirely convinced by the cost argument, unless you'd wire
a lot of channels. You still need a PC and soundcard at the other
end.
But if you can live with the extra latency (which can be kept low
in particular if you use dedicated NICs and wiring), yes this
could be a use case.
Another use case for a studio would be to make the studio output(s)
available everywhere in the building - offices, bar, etc.
I originally developed this to be able to record concerts at the
concert hall of the CdM in the studio which is at the other end
of the building and on a different floor. Installing audio cables
or an optical fibre was out of the question, but there is network
wiring everywhere.
Ciao,
--
FA
A world of exhaustive, reliable metadata would be an utopia.
It's also a pipe-dream, founded on self-delusion, nerd hubris
and hysterically inflated market opportunities. (Cory Doctorow)
Anyone interested in beta-testing this, please let me know.
Zita-njbridge
-------------
Command line Jack clients to transmit full quality
multichannel audio over a local IP network, with
adaptive resampling at the receiver.
Main features:
* One-to-one (UDP) or one-to-many (multicast).
* Sender and receiver(s) can each have their own
sample rate and period size.
* Up to 64 channels, 16 or 24 bit or float samples.
* Receiver(s) can select any combination of channels.
* Low latency, optional additional buffering.
* High quality jitter-free resampling.
* Graceful handling of xruns, skipped cycles, lost
packets and freewheeling.
* IP6 fully supported.
* Requires zita-resampler, no other dependencies.
Note that this version is meant for use on a *local*
network. It may or may not work on the wider internet,
if receiver(s) are configured for additional buffering
and if you are lucky. The current code will replace
any gaps in the audio stream by silence, and does not
attempt to re-insert packets that arrive out of order.
You will need a fairly recent Jack version, as the
code uses jack_get_cycle_times() and no fallback for
that is provided.
Ciao,
--
FA
A world of exhaustive, reliable metadata would be an utopia.
It's also a pipe-dream, founded on self-delusion, nerd hubris
and hysterically inflated market opportunities. (Cory Doctorow)
As referred to earlier, I am working on an app to use a standard computer
keyboard for DAW (or other) control. I have started with actkbd and am
moving on from there. actkbd remains fully usable, but I have added things
to make it useful for controlling DAW apps. My first attempt at a jackd
interface was to allow keys to control the JACK transport. The actions I
set up are:
roll
stop
zero
forward 1 sec (48 kframes at 48 kHz)
forward 10 sec (480 kframes)
back 1 sec
back 10 sec
Because of how actkbd is set up, the keys used for these are fully
configurable. I have been using the numeric keypad:
Enter = roll
0 = stop
+ = forward
+ repeat = fast forward
etc.
The repeat can be used or ignored. In the future I would like to set up
the forward and back so the user can configure the number of frames rather
than having just two choices, but I am more interested in proving the
concept first.
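The seconds-to-frames arithmetic behind the forward/back actions is simple
enough to sketch (hypothetical function name; the only wrinkle is clamping
so a seek back never lands before frame zero):

```python
def relocate(current_frame, seconds, sample_rate=48000):
    """Return the new transport frame after seeking by `seconds`
    (negative = back), never before frame zero."""
    return max(0, current_frame + seconds * sample_rate)

relocate(0, 1)         # forward 1 sec from zero: 48 kframes at 48 kHz
relocate(240000, -10)  # back 10 sec from the 5-second mark clamps to 0
```

Making the number of seconds user-configurable then just means reading the
two values from the config file instead of hard-coding 1 and 10.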
My next thing is to set up (JACK) MIDI out (the port shows on the graph so
far :)
However, while testing this (with Ardour, as it happens), I am wondering
about the merits of using JACK transport at all. That is, would it be
better to use MIDI only, to control one application's handling of the
transport, rather than controlling the transport directly? In this case
the user would have the option of which to use, because they would not
have to configure any keys to send transport actions. But I am wondering
if it would cause problems that could be easily avoided by not offering
transport control at all.
Lots of things still to figure out:
- send unused keys to another system keyboard interface so X will grab them
(allows splitting the keyboard)
- output MIDI info on both JACK and ALSA
- gracefully not use JACK outputs if JACK is not running
- detect JACK showing up and start using JACK outputs
- create some sample, but useful, config files for those who don't want to
make their own
- create a GUI to make config files
So far, I have been using a second keyboard which is "grabbed" so that X
doesn't see it.
--
Len Ovens
www.ovenwerks.net
Sorry about the posting on LAA; the reply should have been discarded, but
instead I ticked accept :(
Anyhow, I'm forwarding it to LAD because I guess it fits better here.
Bye,
Jeremy
-------- Original Message --------
Subject: Re: [LAA] jackwsmeter 1
Date: Mon, 14 Jul 2014 11:13:56 +0200
From: Guillaume Pellerin <lists(a)parisson.com>
To: linux-audio-announce(a)lists.linuxaudio.org
Hi Fred,
Great project!
I get this:
momo@wm22:~/dev/audio/jackwsmeter$ make
CC jackwsmeter-jackwsmeter.o
jackwsmeter.c: In function ‘callback_http’:
jackwsmeter.c:92:3: error: too few arguments to function ‘libwebsockets_serve_http_file’
In file included from jackwsmeter.c:41:0:
/usr/local/include/libwebsockets.h:1077:1: note: declared here
jackwsmeter.c:125:7: error: ‘LWS_CALLBACK_SET_MODE_POLL_FD’ undeclared (first use in this function)
jackwsmeter.c:125:7: note: each undeclared identifier is reported only once for each function it appears in
jackwsmeter.c:129:7: error: ‘LWS_CALLBACK_CLEAR_MODE_POLL_FD’ undeclared (first use in this function)
make: *** [jackwsmeter-jackwsmeter.o] Erreur 1
Cheers,
Guillaume
_______________________________________________
Linux-audio-announce mailing list
Linux-audio-announce(a)lists.linuxaudio.org
http://lists.linuxaudio.org/listinfo/linux-audio-announce
Hi,
I would like to announce some tools around OSC.
oschema: a format definition to describe OSC units
https://github.com/7890/oschema
oscdoc: create HTML documentation from oschema instances
https://github.com/7890/oscdoc
txl: a simplified text format that can be translated to XML (and vice versa)
https://github.com/7890/txl
Basic idea:
-Having a standardized, machine-readable format to describe an OSC API
-Deriving "stuff" from the description, like documentation, code skeletons, etc.
-Letting programs use an OSC API dynamically by looking at its definition
Proposed workflow:
-Write OSC API definition using txl (optional)
-Convert txl to XML (optional)
cat my.txl | txl2xml > my.xml
-Use post-processing chain for desired output (i.e. oscdoc)
oscdoc my.xml /tmp/mydoc
If I got your attention, please get a quick overview here before cloning:
http://lowres.ch/oschema/oschema.html (oschema documentation)
http://lowres.ch/oschema/oschema.svg (interactive structure)
http://lowres.ch/oscdoc/unit.txl (an example txl file describing an OSC unit)
http://lowres.ch/oscdoc/unit.xml (corresponding XML file, oschema instance
document)
http://lowres.ch/oscdoc/index.html (output of oscdoc)
Please let me know when you find anything unclear or missing.
Have a nice day
Tom
Just looking through the MIDI messages used by the Mackie and other
controllers. It appears to me that the protocol is designed to be used on
its own physical MIDI channel, as it uses (up to) all the logical MIDI
channels (1 to 16) and so could not work running through a MIDI keyboard,
for example. It is just the faders that do this, using the pitch control
on each channel; all the switches, lamp signals and encoders are on
channel 1 (or 0 if you like), which would also conflict with some of the
older keyboards like the DX7, which use only channel 1. Now obviously, if
the controller is a USB device, it creates its own MIDI port anyway. I
took a look at Ardour (because that is what I use to record) and it has a
port dedicated to the controller (and other things), so that is not a
problem.
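Since the faders ride on the pitch-bend message of each channel, decoding
one is just reassembling the 14-bit value from the two data bytes. A quick
sketch (this is the standard MIDI pitch-bend byte layout, nothing
Mackie-specific):

```python
def decode_fader(status, lsb, msb):
    """Decode a pitch-bend message (0xEn, LSB, MSB) into
    (channel, 14-bit fader position 0..16383)."""
    if status & 0xF0 != 0xE0:
        raise ValueError("not a pitch-bend status byte")
    return status & 0x0F, (msb << 7) | lsb

decode_fader(0xE0, 0x00, 0x40)  # channel 0 fader at mid position (8192)
```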
My question then is: do all applications that can be controlled by MIDI
have a dedicated port for control? (Well, most anyway. The Non group of
applications accepts MIDI control of the transport, but not the mixer,
which wants MIDI converted to CV first... making it effectively not
controllable via a control surface without some sort of software interface
to take care of banks etc.) Hmm, directly controlling the JACK transport
might be an idea.
--
Len Ovens
www.ovenwerks.net
Hello, I am new to this list, but I realized I was asking a lot of
development questions on LAU that would probably be better here.
I am writing a program that takes some system input from the keyboard and
redirects it as MIDI. This is to allow:
- a second keyboard to be used as a MIDI controller
- switching the system keyboard back and forth between normal and MIDI
controller
- using a portion of the system keyboard as a MIDI controller (the numeric
pad, for example)
The advantage of this over keyboard shortcuts is that this will work the
same no matter what the desktop focus may be.
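The core translation is just a lookup from key codes to MIDI bytes, and it
is focus-independent because it happens below X. A rough sketch (the key
codes, note numbers and map here are made up for illustration):

```python
NOTE_ON, NOTE_OFF = 0x90, 0x80

def key_to_midi(keycode, pressed, keymap, channel=0):
    """Translate one key press/release into a 3-byte MIDI note message,
    or None for unmapped keys (which should be passed on to X)."""
    note = keymap.get(keycode)
    if note is None:
        return None
    status = (NOTE_ON if pressed else NOTE_OFF) | channel
    velocity = 0x64 if pressed else 0x00
    return bytes([status, note, velocity])

keymap = {82: 60, 79: 62}      # hypothetical: keypad 0 -> C4, keypad 1 -> D4
key_to_midi(82, True, keymap)  # note-on for C4
```

The same table could just as well emit controller or transport (MMC)
messages instead of notes.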
I am (as many people do) writing this for my own use, but will make it
available for others who may have similar needs.
My question:
Where and how is the best place to put the MIDI port?
My first thought was to have it create a JACK MIDI port, as this would be
the most direct. The problems I see with this are: 1) my program will run
as a different user than jackd; 2) my program will run before, and perhaps
after, jackd.
So then I thought I guess I have to use ALSA for my MIDI port. I need to
know: if I do this, will a jackd that is running a different backend than
alsa (firewire, for example) still see the ALSA MIDI ports? I am guessing
that a2jmidid would still work in any case, but not everyone uses that.
Assuming I have to use ALSA MIDI (unless someone can suggest how to make
this work with JACK MIDI), I have noticed that some applications that use
ALSA MIDI do not leave a visible port in ALSA and do all connections
internally, which would be useless in this case. There are some examples
of code at http://www.tldp.org/HOWTO/MIDI-HOWTO-9.html and I was wondering
if this method will leave visible MIDI ports. I also notice it is from
2004-ish; is the info still valid, or is there a better place to look?
The keyboard input and grabbing will be based on the code from actkbd, as
it seems to already do all that stuff and is GPL2 (and besides, the code
looks understandable to me).
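For reference, the "grab" that actkbd does boils down to a single ioctl on
the evdev device node. A sketch (the EVIOCGRAB number is computed as
_IOW('E', 0x90, int), matching linux/input.h on typical kernels; the device
path is only an example):

```python
import fcntl

def _IOW(type_chr, nr, size):
    """Recompute the Linux _IOW ioctl request number (write direction)."""
    return (1 << 30) | (size << 16) | (ord(type_chr) << 8) | nr

EVIOCGRAB = _IOW('E', 0x90, 4)  # from <linux/input.h>

def grab(fd, on=True):
    """Take (or release) an exclusive grab on an evdev keyboard,
    so its events never reach X."""
    fcntl.ioctl(fd, EVIOCGRAB, 1 if on else 0)

# usage (example device path):
# fd = open('/dev/input/event3', 'rb')
# grab(fd)
```

Releasing the grab (grab(fd, False)) is what would hand the keyboard back
to X for the "switching back and forth" mode described above.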
--
Len Ovens
www.ovenwerks.net