Hi Francesco,
On 18.03.2017 at 18:07, Francesco Ceruti wrote:
> "JACK-Client" is used by one of the output-elements, [...] if you don't have JACK-Client and the element fail to load, it
> will not be visibile in the options.
Thanks for the comprehensive answer. I figured out as much in the
meantime, added a new "python-jack-client" package to the AUR, and made it
an optional dependency of linux-show-player.
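In case anyone wonders how the optional backend behaves, the mechanism
Francesco describes is essentially a guarded import: if the module isn't
there, the element simply never registers. A minimal sketch (not the actual
linux-show-player code, and the element names are made up), assuming the
package provides a module named jack:

    # Hypothetical sketch of an optional JACK output element.
    try:
        import jack  # provided by python-jack-client
        HAVE_JACK = True
    except ImportError:
        HAVE_JACK = False

    def available_outputs():
        """Names of the output elements that can actually be offered."""
        outputs = ["alsa", "pulseaudio"]  # placeholder element names
        if HAVE_JACK:
            outputs.append("jack")        # only visible when the import succeeded
        return outputs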
Overall I'm very impressed with the polishedness (is that a word?) of
the application.
Cheers, Chris
Hi,
What exactly does the fader in Ardour's mixer control for MIDI tracks?
I noticed that CC 7, CC 10, etc. can be selected in the editor, in
addition to the mixer's fader. CC 7 does control the volume, while the
mixer's fader seems to affect the note velocity.
I suspect it simply increases or decreases the recorded note velocity,
but by what values?
= 64 means velocity +/- 0?
> 64 means velocity * 1.n up to velocity * 2?
< 64 means velocity * 0.n down to velocity = 0?
Or does it do something completely different?
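To make my guess concrete, the scaling I am imagining would look roughly
like the sketch below (purely hypothetical, not necessarily what Ardour
actually does):

    # Hypothetical linear velocity scaling by a 0-127 fader/CC value:
    # 64 -> unchanged, 127 -> roughly doubled, 0 -> silent.
    def scale_velocity(velocity, fader_value):
        factor = fader_value / 64.0                      # 64 maps to a factor of 1.0
        return max(0, min(127, round(velocity * factor)))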
Regards,
Ralf
Being the OP (I think), I didn't mean to stir controversy with my questions.
As has been pointed out, there are different ways to get to your preferred
configuration, be that qjackctl, your own home brew, or some combination of
things.
Personally, most of the time I just want it to work so I can get on with MY
business, and not the business of becoming a guru on some part of the chain
I need to get MY business done.
(My OP came from a desire to figure out why a particular configuration
seemed to be so finicky: a particular laptop, the ALSA interface, and the USB
in/out ports presented by a Behringer X32. Plugging the same USB cable into
another laptop just worked every time. In order to troubleshoot, I thought
maybe I'd gain more insight by mimicking what qjackctl does from the
command line. But when I tried to duplicate a qjackctl start from the
command line, I was stymied as to how.)
The entire Linux audio landscape (generic and pro) and its evolution make it
very difficult to get a clear picture of how it all works and how to get
what you want. (I've been a hardware/software engineer for 40 years,
messing with JACK for the last 10, and I still have questions...) To a
newcomer or someone coming from a different audio platform, it would be hard
not to wonder whether Linux audiophiles aren't trying to make it appear you
need to be some sort of Linux mage to make it work. When, in fact, you just
need to be a persistent investigator, due to its evolution (and the lack of
cohesive end-user documentation, again because of its evolution). Oh, and
some background in software, hardware, signal processing, and system
configuration doesn't hurt... Those who just want to make music or
record something quickly had better stick to other platforms and cough up the
money that requires.
For me the corner-case argument is moot. There are myriad use cases and
workflows. In my case I mostly do theater work, where playing
prerecorded tracks, controlling various devices (mixers, projectors, sound
FX, players, etc.), and following a script is typical. For sound, that
typically means using player apps that may or may not be JACK-aware
(though, when the choice is mine, I just use LinuxShowPlayer, since it
knows JACK), so having the pulse ports just appear when JACK is started (by
whatever means) is nice. And, while I RARELY use Skype, being able to play
YouTube without having to mess around is handy. Yes, I know I could get it
all to play nice with just JACK... but that's not what I prefer to spend my
time doing.
And, for full disclosure, I use JACK to get to MIDI in ShowControl, the
open-source theater production control software we are developing. But I
don't have any desire to know more about the internals of JACK than I need
in order to use it as a tool. (I do need to figure out how to get that to
start with the preferred sound configuration... so I'll be looking in detail
at things like Christopher's jack-select; alas, deeper into the rabbit hole.)
;)
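From a quick glance, the heart of a tool like jack-select appears to be the
D-Bus interface that jackdbus exposes. The sketch below is my rough
understanding, assuming dbus-python and jack2's jackdbus; the service and
method names are what I believe jackdbus publishes, so treat it as a
starting point, not gospel:

    # Rough sketch: control jackdbus over the D-Bus session bus.
    import dbus

    bus = dbus.SessionBus()
    controller = bus.get_object("org.jackaudio.service", "/org/jackaudio/Controller")
    control = dbus.Interface(controller, "org.jackaudio.JackControl")

    if not control.IsStarted():
        control.StartServer()              # starts jackdbus with its stored settings
    print("sample rate:", control.GetSampleRate())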
It's all good...just complicated. :)
Regards,
Mac
On Tue, Mar 14, 2017 at 1:28 AM, Ralf Mardorf <ralf.mardorf(a)alice-dsl.net>
wrote:
> On Tue, 14 Mar 2017 00:28:20 +0100, Christopher Arndt wrote:
> >On 13.03.2017 at 20:04, Ralf Mardorf wrote:
> >> DAW users very seldom need jackdbus and other users very seldom need
> >> the jack sound server at all.
> >
> >That's just your opinion, though. I find jackdbus very useful. I often
> >switch audio interfaces and JACK's dbus interface makes this a lot
> >easier.
>
> That's ridiculous, jackdbus might have advantages for some usage, but
> your examples don't cover them.
>
> >I wrote a little desktop systray app to switch JACK configurations via
> >dbus (I already announced this here a while ago):
> >
> >https://github.com/SpotlightKid/jack-select
>
> Selecting a sound card could be done with qjackctl, ardour, by command
> line and a lot of other ways, too, when using jackd.
>
> >For auto-connecting JACK audio and MIDI ports, I wrote a little helper
> >too:
> >
> >https://github.com/SpotlightKid/jack-matchmaker
>
> Connections for jackd audio and midi as well as alsa midi could be
> stored and restored by e.g. aj-snapshot, too.
>
> On Mon, 13 Mar 2017 14:27:09 -0700 (PDT), Len Ovens wrote:
> >This is a case where the combination of pulse and jackdbus work quite
> >well. Wanting to have a DAW on the screen and be able to, without
> >exiting from the DAW, play a u-toob vid...
>
> You don't need pulseaudio to do this, but even if it did require
> pulseaudio, it's a corner case.
>
> >There are a number of workflows where a pulse/jack combination is the
> >only way to make it work. (radio studio that uses skype for callin or
> >interviews comes to mind)
>
> Apart from skype interviews, what are those numerous workflows? How
> often do you hear skype interviews on radio? This is a corner case.
>
> Regards,
> Ralf
Hi,
I'm running multiple PCs with Ubuntu Studio 16.04.
I'm trying to figure out some differences in what happens between the PCs
when JACK is started.
Questions:
- What is the process for determining which combination of jackd, jackdbus,
jack1, and jack2 is actually being run, from the command line or from qjackctl?
- Why does starting JACK from the command line not result in a pulse
source/sink, while starting it from qjackctl does end up with pulse
source/sink and system source/sink? (And how do I duplicate, from the command
line, what happens when JACK is started from qjackctl?)
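My working assumption (which I'd love to have confirmed) is that qjackctl,
with its D-Bus option enabled, starts jackdbus rather than plain jackd, and
that PulseAudio's jackdbus-detect module then creates the pulse source/sink
when it notices JACK. If that is right, something like the sketch below
should roughly duplicate a qjackctl start from the command line
(jack_control ships with jack2; the device name is just a placeholder, and
the pactl lines should only be needed if the detect module isn't loaded):

    # Sketch: approximate "start JACK the way qjackctl does" without the GUI.
    # Assumes jack2 (which provides jackdbus and jack_control) plus PulseAudio.
    import subprocess

    def run(*cmd):
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    run("jack_control", "ds", "alsa")                # select the ALSA driver
    run("jack_control", "dps", "device", "hw:X32")   # placeholder device name
    run("jack_control", "start")                     # starts jackdbus, not plain jackd
    # If the pulse source/sink do not appear on their own:
    run("pactl", "load-module", "module-jack-sink")
    run("pactl", "load-module", "module-jack-source")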
Regards,
Mac
— Apologies for cross-postings, please distribute —
__Algorithms that Matter__
Artistic Research Residency - Call for Applications
https://almat.iem.at/call
Algorithms that Matter (Almat) is an artistic research project funded by
the Austrian Science Fund FWF, PEEK AR 403-GBL, and based at the
Institute of Electronic Music and Acoustics (IEM) in Graz, Austria.
Almat is happy to announce a call for three artistic residencies to take
place in 2017/18!
__What__
Algorithms that Matter is an artistic research project by Hanns Holger
Rutz and David Pirrò. It aims at understanding the increasing influence
of algorithms, translating them into aesthetic positions in sound, and
building a new perspective on algorithm agency by subjecting the realm
of algorithms to experimentation.
Almat is grounded in the idea that algorithms are agents that
co-determine the boundary between an artistic machine or “apparatus” and
the object produced through this machine. The central question is: How
do algorithmic processes emerge and structure the praxis of experimental
computer music? The hypothesis is that these processes, instead of being
separated from the composer—as generators and transformers of infinite
shapes—exhibit a specific force that retroacts and changes the very
praxis of composition and performance.
A series of systematic experiments is carried out by the project team
together with the artists in residence. Over defined periods of time,
artists develop series of interrelated sound pieces that interrogate
specific aspects of algorithmicity. The work process is observed and
transcribed into complementary forms of presentation and discourse,
including concerts and exhibitions, an online public “continuous
exposition”, and gatherings that connect artists-researchers across
various institutions in Europe.
The project not only aims at extending the praxis of experimental
computer music using algorithmic processes, but also at contributing to
the scope and methodology of artistic research.
__Who__
This call is directed at experienced practitioners in sound art and
experimental computer music for whom algorithmic and computational
strategies are a central element of their work. Applicants should
be able to demonstrate this through their past work. They should be capable
of, and interested in, exposing and reflecting on their process and work
through writing, discussion and other forms of discourse. During the project,
they are expected to engage either with software frameworks that the
project team has developed or with software they themselves have developed
in open-source environments.
__Residencies__
The project offers three residencies in the period from October 2017 to
December 2018
(see possible periods listed in the application form).
The project is conceived in such a way that applicants will work
collaboratively with the project team. Each residency will be preceded
by an online preparatory process in cooperation with the project team
(this will begin at the latest two months before the start of the actual
residency). The applicants are then expected to spend the two months of
the residency in situ.
The residency's aim is to carry out algorithmic experimentation (in the
form of compositions, installations, etc.) in an iterative setting, with
a focus on process and specific constraints. Applicants are expected to
work both independently and collaboratively during the residency. An
additional researcher will observe and document the work process and
contribute to the public exposition of the traces.
The tangible outcomes will be presented locally and internationally by
the project team, and the artists are invited to attend events related
to the project. We want to emphasise that research is the prime focus of
the residency, but that we intend to provide opportunities for public
presentation of the works and artefacts created through the process.
__Application__
Applicants are advised to carefully read through the summary,
research questions and methodology provided on the project's website:
https://almat.iem.at
Please fill out the form provided at https://almat.iem.at/call.html and
send it to almat(a)iem.at along with the required accompanying documents.
We aim for a balance of gender and background among the applicants,
ensuring the diversity of the project.
__Conditions__
- Duration of residency: 2 months
- Start date (in situ): October 2017 (select time slots in form)
- Remuneration: c. 5,000 EUR total (contract for work; subject to
taxation where applicable). This sum is meant to cover all expenses,
such as travelling, living costs and accommodation.
__Application deadline:__ 16 April 2017 (e-mail reception, 24:00 CET)
If you have further questions, please do not hesitate to contact us at
almat(a)iem.at.
__Additional Information__
The Institute of Electronic Music and Acoustics is a department of the
University of Music and Performing Arts Graz, founded in 1965. It is a
leading institution in its field, with a staff of more than 25
researchers and artists. IEM offers education to students in composition
and computer music, sound engineering, contemporary music performance
and musicology.
What we offer:
- 24/7 access to the facilities of the IEM (http://iem.kug.ac.at/en).
- A desk in a shared office space for the entire period and access to
studios, including the CUBE, which has a 24-channel loudspeaker system and
motion tracking, subject to availability.
What we cannot offer:
- We cannot provide housing, although we will try to help with
finding accommodation.
On Mar 9, 2017 09:15, jonetsu <jonetsu(a)teksavvy.com> wrote:
>
> On Thu, 09 Mar 2017 09:03:21 -1000
> David Jones <gnome(a)hawaii.rr.com> wrote:
>
> > On Mar 9, 2017 08:24, "Peter P." <peterparker(a)fastmail.com> wrote:
> > >
> > > * David Jones <gnome(a)hawaii.rr.com> [2017-03-09 18:55]:
> > > > On Mar 8, 2017 23:50, "Peter P." <peterparker(a)fastmail.com>
> > > > wrote:
> > > > >
> > > > > Hi dear list,
> > > > >
> > > > > I was wondering if anyone has ever successfully tried to
> > > > > extract audio files embedded into adobe pdf files?
> > > > >
> > > > > thanks
> > > >
> > > > Hmm, play the PDF while running jack_capture to save the audio
> > > > output to another file?
> > > Hi, and thank you! In order to play the audio in the PDF I guess I
> > > would need to install adobe reader (which I want to avoid).
> >
> > Well, I suppose you could investigate whatever parts of the PDF
> > standard pertain to embedding audio and maybe figure out how to use
> > Linux tools to programmatically extract it. I have no idea how to do
> > that.
> >
> > I think installing Acrobat Reader would be faster. You could
> > uninstall it afterward.
>
>
> Extracting audio from PDF files in Linux is well documented. Simply
> use google and be certain not to use search words like 'shootout',
> 'Spectre', 'James Bond', 'wild west rhetoric' and 'mix wars'. Provided
> the OP is able to do that, results could be obtained. :)
I typed "Linux how to extract audio from a PDF file" into my fave search engine and got useful links.
David W. Jones
gnome(a)hawaii.rr.com
authenticity, honesty, community
http://dancingtreefrog.com
Hi list,
I have a few bugs and questions about running the Reaper DAW under Wine
that sit unanswered on the Reaper (Cockos Confederated) forum, so I
take the liberty of asking on this list whether anyone has experienced similar
problems.
1.)
Reaper loses all keyboard shortcuts. This usually happens after
switching between Reaper and other applications or after switching
desktops multiple times.
2.)
The "Save Project..." dialog suggests EDL TXT (Vegas) as default file
extension, which can be changed by the user manually every time, but is
very annoying.
3.)
When I add a single mono audio file to an existing Reaper timeline
under Linux/Wine from a graphical file manager (xfe) using drag and drop, the
file is inserted twice, on two adjacent tracks, with a time offset
between the two copies.
All three issues occur with Reaper 5.30 under Wine 1.6.2-20 on Debian testing
with the Fluxbox window manager, and have been present for several older
versions of Reaper already.
thanks!
Peter