I have a desktop computer and a Raspberry Pi 3 B+.
I want the Raspberry Pi to control my speakers and share them with my
desktop over a direct Ethernet connection.
A USB microphone will be connected to the Raspberry Pi and shared with
the desktop computer over the same Ethernet connection, too.
Can netjack1/2 over a direct Ethernet connection substitute for local ALSA
audio?
Will video and audio stay in sync if netjack is used over a direct Ethernet
connection?
My Raspberry Pi can turn audio gear on and off via relays.
When my desktop starts using its speakers, I want it to tell the Raspberry Pi
to turn the speakers on via a relay.
When nothing has used the speakers for a minute, I want it to tell the
Raspberry Pi to turn them off.
I can do the same for the USB microphone, which can be powered on and off
with uhubctl.
Does ALSA have a callback API I can hook into? Does JACK?
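To make the relay part concrete, this is roughly what I have in mind on the
JACK side: a small client that gets notified via
jack_set_port_connect_callback() and checks whether anything is connected to
the hardware playback ports. Only an untested sketch - relay_set() is a
placeholder for whatever GPIO or uhubctl command ends up switching the
hardware, and it only sees JACK connections, not plain ALSA clients.

/* Build: gcc -o relay-watch relay-watch.c $(pkg-config --cflags --libs jack) */
#include <stdio.h>
#include <time.h>
#include <unistd.h>
#include <jack/jack.h>

static jack_client_t *client;
static volatile int graph_changed = 1;

static void relay_set(int on)
{
    /* Placeholder: print instead of driving a GPIO pin or calling uhubctl. */
    fprintf(stderr, "relay -> %s\n", on ? "on" : "off");
}

/* JACK calls this from its notification thread whenever two ports are
 * (dis)connected; we only set a flag and do the real work in main(). */
static void on_connect(jack_port_id_t a, jack_port_id_t b, int connect, void *arg)
{
    (void)a; (void)b; (void)connect; (void)arg;
    graph_changed = 1;
}

/* Is anything currently connected to the hardware playback ports? */
static int playback_in_use(void)
{
    const char **ports = jack_get_ports(client, "system:playback_.*",
                                        NULL, JackPortIsInput);
    int used = 0;
    if (ports) {
        for (int i = 0; ports[i]; ++i) {
            jack_port_t *p = jack_port_by_name(client, ports[i]);
            if (p && jack_port_connected(p) > 0)
                used = 1;
        }
        jack_free(ports);
    }
    return used;
}

int main(void)
{
    client = jack_client_open("relay-watch", JackNullOption, NULL);
    if (!client)
        return 1;
    jack_set_port_connect_callback(client, on_connect, NULL);
    jack_activate(client);

    int in_use = 0, relay_on = 0;
    time_t last_active = time(NULL);
    for (;;) {
        sleep(5);
        if (graph_changed) {
            graph_changed = 0;
            in_use = playback_in_use();
        }
        if (in_use) {
            last_active = time(NULL);
            if (!relay_on) { relay_set(1); relay_on = 1; }
        } else if (relay_on && time(NULL) - last_active > 60) {
            /* nothing has used the speakers for a minute */
            relay_set(0);
            relay_on = 0;
        }
    }
}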
For Yoshimi and Zyn, there are a very large number of patches that have no
copyright/license information filled in. For a start, this makes it
impossible to give any kind of acknowledgement. Potentially more critical is
the new, messy copyright legislation staggering through the EU.
I don't know if there is any practical way of adding this to old files, but at
least when creating new ones *please* fill these fields in - even if you don't
think you'll ever release them.
As an aside, a lot of these also have no 'Type' filled in :(
Apologies for the cross-post - not really sure where this should go.
--
Will J Godfrey
http://www.musically.me.uk
Say you have a poem and I have a tune.
Exchange them and we can both have a poem, a tune, and a song.
Dear list,
to avoid the https://en.wikipedia.org/wiki/XY_problem I'm going to tell
you my goal first, then my current route to a solution, with a missing
piece.
I work with several MIDI instruments. Some of them are USB, some are DIN,
and all of them have different names. Only one is ever connected at a time.
I want my programs to accept data entry from my instruments, no matter
which of them is plugged in.
What is the easiest way to do this?
My current idea is to establish a virtual JACK MIDI thru port. All
software can use this port as its MIDI data source.
Hardware instruments can then be connected via one of the many
auto-connector solutions: the one built into QJackCtl, or
https://github.com/SpotlightKid/jack-matchmaker , which is what I am
using already.
The virtual JACK MIDI port is the missing piece.
What is the best way to get one? "Best" means fewest resources and least
administration overhead. E.g. any answer involving the words "plugin
host" is already too much. Ideal would be a simple daemon program I can
autorun after starting JACK (with qjackctl script-after-start).
So I am either looking for that midi thru port or a different solution
that works even better.
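To make the missing piece concrete, the kind of daemon I imagine is just a
JACK client with one MIDI in and one MIDI out port that copies every event
through in the process callback. An untested sketch - client and port names
are arbitrary placeholders:

/* Build: gcc -o midi-thru midi-thru.c $(pkg-config --cflags --libs jack) */
#include <unistd.h>
#include <jack/jack.h>
#include <jack/midiport.h>

static jack_port_t *in_port, *out_port;

/* Copy all incoming MIDI events to the output port, unchanged. */
static int process(jack_nframes_t nframes, void *arg)
{
    (void)arg;
    void *in_buf  = jack_port_get_buffer(in_port, nframes);
    void *out_buf = jack_port_get_buffer(out_port, nframes);

    jack_midi_clear_buffer(out_buf);

    jack_nframes_t n = jack_midi_get_event_count(in_buf);
    for (jack_nframes_t i = 0; i < n; ++i) {
        jack_midi_event_t ev;
        if (jack_midi_event_get(&ev, in_buf, i) == 0)
            jack_midi_event_write(out_buf, ev.time, ev.buffer, ev.size);
    }
    return 0;
}

int main(void)
{
    jack_client_t *client = jack_client_open("midi-thru", JackNullOption, NULL);
    if (!client)
        return 1;

    in_port  = jack_port_register(client, "in",  JACK_DEFAULT_MIDI_TYPE,
                                  JackPortIsInput, 0);
    out_port = jack_port_register(client, "out", JACK_DEFAULT_MIDI_TYPE,
                                  JackPortIsOutput, 0);

    jack_set_process_callback(client, process, NULL);
    jack_activate(client);

    for (;;)
        sleep(60);   /* just keep the client alive */
}

jack-matchmaker (or the QJackCtl patchbay) could then auto-connect whatever
instrument shows up to midi-thru:in, and all software reads from
midi-thru:out.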
-hgn
P.S.
In case the JACK devs are reading:
AFAIK JACK works under the assumption that audio system out 1 and 2 are
the main abstracted stereo outputs and are available on any system. This
leads to great portability across all systems.
I consider MIDI data entry with a single instrument an equally typical
use case and would like to propose adding a single MIDI thru port to
JACK directly. No more, no less, nothing to configure, nothing to check
for as a user or developer.
On March 16, 2019 6:26:58 PM GMT+01:00, Moshe Werner <moshwe(a)gmail.com> wrote:
>Hey,
>
>Afaik, usb2 CC works up to 48kHz...
>I may be wrong though.
>
>Cheers
>Moshe
At least that is how it seems to be in this case with this device. (But I think I had 96k running even on USB1 devices before.)
Then the question would be what has to be done to make 96k available on Linux. (They have an ASIO driver for Windows, and it seems to work out of the box on OS X.)
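A first step might be to check which rates the ALSA driver itself reports
for the interface. A rough sketch along those lines (the device name "hw:1"
is only an example - adjust it to the card):

/* Build: gcc -o rates rates.c -lasound */
#include <stdio.h>
#include <alsa/asoundlib.h>

int main(int argc, char **argv)
{
    const char *dev = argc > 1 ? argv[1] : "hw:1";   /* example device name */
    static const unsigned int rates[] =
        { 44100, 48000, 88200, 96000, 176400, 192000 };

    snd_pcm_t *pcm;
    if (snd_pcm_open(&pcm, dev, SND_PCM_STREAM_PLAYBACK, 0) < 0) {
        fprintf(stderr, "cannot open %s\n", dev);
        return 1;
    }

    snd_pcm_hw_params_t *params;
    snd_pcm_hw_params_alloca(&params);
    snd_pcm_hw_params_any(pcm, params);

    /* Ask the driver whether each rate is usable on this device. */
    for (size_t i = 0; i < sizeof(rates) / sizeof(rates[0]); ++i)
        printf("%6u Hz: %s\n", rates[i],
               snd_pcm_hw_params_test_rate(pcm, params, rates[i], 0) == 0
                   ? "supported" : "not supported");

    snd_pcm_close(pcm);
    return 0;
}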
--
Sent from my Android device with K-9 Mail. Please excuse my brevity.
Hi all,
A group of people who are not attending the Linux Audio Conference at
Stanford this year are meeting at c-base.org to participate remotely.
http://lac.linuxaudio.org/2019/#program
We'll watch the live-stream of the paper presentations on Saturday,
Sunday, and Monday, 17h to 21h CET, and share dinner, which is sponsored by
Paul Davis and the Ardour community!
You're very welcome to join our small local hallway track in the space
station below Berlin-Mitte, located in the 2nd backyard of Rungestraße
20, 10179 Berlin.
There'll also be an un-conference BarCamp track, likely some LV2 hacking
over the weekend. Monday evening overlaps with the Bitwig user group at
c-base. Going to be fun!
Looking forward to seeing you there,
robin
PS. Safe travels to all who head to Stanford.
Greetings,
I do still play a bit and even write the occasional old-fashioned song.
https://soundcloud.com/davephillips69/fair-enough-aka-the-afaict-song
No great claims for production values, but I hope you find it enjoyable.
Recorded with two mics, a cheap Ibanez electroacoustic, and Ardour6-dev.
Best,
dp
On 3/16/19 10:54 AM, Lorenzo Sutton wrote:
> On 16/03/2019 21:34, Brett McCoy wrote:
>> In the "Add Control Ruler" menu in the Matrix Editor, there are only
>> these CCs listed
>>
>> 10 Pan
>> 93 Chorus
>> 7 Volume
>> 91 Reverb
>> 64 Sustain
>> 11 Expression
>> 1 Modulation
>>
>> How can I add a ruler for an arbitrary CC#? For instance, I am
>> sending MIDI to an external sample player that has CC#2 available for
>> playing an ornamented note. How can I send this from Rosegarden (aside
>> from using the Event Editor, which is painful)?
>
> Try this..
>
> Open: Studio > Manage MIDI Devices
>
> In the top-left area, select the "Controllers..." button for the MIDI
> device you are connecting the sample player to.
>
> A window appears where you can manage CC#s, including adding new ones.
>
> When you're done, the newly added CC will appear in any "Add Control
> Ruler" menu (the button with the star icon), e.g. in the matrix or
> notation editor.
>
> I used this quite a bit to have CC71 and CC74 rulers when playing to
> yoshimi ;)
So what do CC71 and CC74 do when sent to yoshimi?
--
David W. Jones
gnome(a)hawaii.rr.com
authenticity, honesty, community
http://dancingtreefrog.com