On Tue, 18 Oct 2016 21:01:44 +0200
Emanuele Rusconi <emarsk(a)gmail.com> wrote:
> Hello,
>
> Hi jonetsu, I'm trying to follow the discussion but I still can't
> figure out what you'd like to achieve.
A discussion.
> You talked about plugins that "apply themselves as a group to an audio
> track, to stems, to a whole session". That's what buses are for,
> unless I misunderstood what you mean.
Well, if the whole sentence had been quoted, it would perhaps make more
sense. The whole sentence was:
"Plugins that network themselves together, sharing audio
analysis, sharing data, taking decisions together on how to
apply themselves as a group to an audio track, to stems, to a
whole session."
The quality, or feature, would be applied in those three contexts. The
functionality itself consists of the sharing of audio analysis.
> You talked about a suite of plugins which auto-configure themselves
> based on track analysis. Correct me if I'm wrong, but that doesn't
> require (just) inter-plugin communication, it requires that the
> entire suite is developed together as a unit; you can't just make
> different random plugins talk together and magically "make a decision
> together" - whatever that means - (hence Paul Davis' first comment:
> "[to] do what they want, they need their own DAW").
I disagree on that, given the many non-proprietary protocols existing
out there that can be used openly.
> And then there's the major issue that others pointed out: to configure
> themselves in a meaningful manner, the plugins must be able to
> analyse not only the audio, but also the engineer's mind to know what
> he wants to achieve.
There is no need to do mind analysis, and it is actually very surprising
that this is brought up, like some kind of B-series SF flick. I do not
know if, when the first automatic transmissions were proposed, some
people exclaimed "oh ! we will lose control, we need to stick (pun ?) to
shifting gears manually, as the auto transmission does not know". As we
know today, automatic transmissions do respond to what the driver wants,
in their own way. Floor it, and it will downshift. Likewise with
anything else. This is no 'Robot Mixer Invaders From Loudness Hell' type
of thing. To assume that engineers will de facto lose total control is,
totally, absurd.
> Personalizing the sound IS part of and HAS impact on the mixing
> stage; it's the goal that no auto-configuring plugin can read in your
> mind.
Again, the slight variation here is that this is about the mixing stage.
When people go and hire a mix engineer, most of them have already
figured out that the guitar part will be electric with a Big Muff and
will not be an acoustic 12-string. This is already done. The colours and
tones and all that are part of the composition and happen at an earlier
stage, in very large part.
So this is about the mixing stage, where the composition of overall
tone is already made. The mixing engineer can modify some things, but
he is basically a stranger to the client's project.
> > I don't see what would be wrong with having a set of presets that
> > supposedly replicates a certain guitarist setup.
>
> I agree here, nothing wrong with (good) presets. But that's something
> we can already have. Look at Guitarix for example: you can have
> presets that tie together different tools: compressor, overdrive, amp
> emulator, cabinet emulator, etc. And you can do that because those
> tools are developed to work tightly together; they are like different
> parameters of one big plugin. And, again, I'm not sure what data
> they would need to share in your vision.
Yes, there are actually tons of these, and tons of discussion. It's
even possible to buy, as a kit, the same equipment that Flea uses:
bass, amp, speakers, pedals, and all that. Some people are totally
dedicated to taking apart the sound configuration of some of the famous
musicians out there.
Your example above stems from Guitarix. This happens way before the
mixing engineer is on the payroll. Hence the comparison is not valid
at all.
> To me, it looks like you're pointing at the moon at noon: sorry, but
> either the moon is somewhere else or I just can't see it anyway.
> You keep talking about plugins communicating together, but to me it's
> not clear WHAT they would supposedly communicate.
> You keep saying "making decisions" as if it would mean anything clear
> but, unless you explain it better, it sounds like either marketing
> fluff or bad-by-design auto-configuration.
Keeping it simple, let's put forward some examples. As you might have
guessed by now, this is based on principles. So let's start with one:
Separation. Clean separation of instruments in the audio field is a
must. You cannot do without it, unless the mix is meant to be leaning
towards mud, which can be a style on its own. Do you see how plugins
communicating between themselves could set up a basic separation
framework to assist the mixing engineer ?
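As a rough sketch of the kind of data such plugins could share (the
functions and heuristic here are invented for illustration, not any
existing plugin API): each plugin publishes its track's per-band
energy, and a coordinator flags bands where two tracks fight for the
same spectral space - candidates for an EQ carve:

```python
import numpy as np

def band_energies(signal, sr, n_bands=10):
    """Split a signal's spectrum into log-spaced bands, return energy per band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    edges = np.logspace(np.log10(40), np.log10(sr / 2), n_bands + 1)
    return np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum()
                     for lo, hi in zip(edges[:-1], edges[1:])])

def masking_bands(track_a, track_b, sr, threshold=0.5):
    """Return indices of bands where the two tracks carry comparable
    energy - i.e. where neither clearly dominates, so they mask each
    other and separation suffers."""
    ea = band_energies(track_a, sr)
    eb = band_energies(track_b, sr)
    ea = ea / (ea.sum() or 1.0)
    eb = eb / (eb.sum() or 1.0)
    # A band is "contested" when both tracks hold a comparable share of
    # their own energy there, and the band is not negligible overall.
    return [i for i, (a, b) in enumerate(zip(ea, eb))
            if min(a, b) > threshold * max(a, b) and max(a, b) > 0.01]
```

Each plugin already computes something like `band_energies` for its own
display anyway; the only new ingredient is publishing it so a peer can
run the comparison.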
Another one: three-dimensional placement of objects. Likewise here,
plugins that can share data and, based on what they did about
separation, can establish, *based on configuration styles or otherwise*,
a basic placement of objects for the mixing engineer to refine.
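Again purely as an illustration (the heuristic and function names are
invented for this sketch), shared spectral analysis could propose a
first-pass stereo placement, following the common rule of thumb that
darker instruments sit near the centre and brighter ones further out:

```python
import numpy as np

def spectral_centroid(signal, sr):
    """Energy-weighted mean frequency of a signal (its 'brightness')."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    return float((freqs * spectrum).sum() / (spectrum.sum() or 1.0))

def propose_pan_positions(tracks, sr):
    """Spread tracks across the stereo field: sort by centroid, keep the
    darkest track central, alternate brighter tracks left/right.
    Returns {name: pan} with pan in [-1.0, 1.0]."""
    order = sorted(tracks, key=lambda name: spectral_centroid(tracks[name], sr))
    pans = {}
    for rank, name in enumerate(order):
        side = -1 if rank % 2 else 1           # alternate left/right
        width = rank / max(len(order) - 1, 1)  # darker tracks stay central
        pans[name] = side * width
    return pans
```

The engineer refines from there; the point is only that the tedious
first pass can be derived from analysis the plugins already share.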
These two tasks have large technical parts to them and yet, they have to
be done for every single mix.
In Mixbus there is now analysis of phase between tracks, analysis that
will actually optimize the phase settings. This is a simple,
uncomplicated (in its own way) step towards having more of the same.
No-one has said "well, I do want the left room mic for the crash to be
out of phase by that much so this thing in Ardour is total crap." Nope.
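How Mixbus does it internally is not stated in this thread, but the
underlying idea can be illustrated in a few lines: search for the lag
and polarity of one mic that best correlate with another, and suggest
that as a starting alignment (a toy sketch, brute-force over small lags):

```python
import numpy as np

def align_phase(mic_a, mic_b, max_lag=64):
    """Find the lag (in samples) and polarity (+1/-1) for mic_b that
    maximise its correlation with mic_a, within +/- max_lag samples.
    Apply the suggestion as: pol * np.roll(mic_b, lag)."""
    best = (0, 1, -np.inf)  # (lag, polarity, score)
    for lag in range(-max_lag, max_lag + 1):
        shifted = np.roll(mic_b, lag)          # circular shift: toy model
        corr = float(np.dot(mic_a, shifted))
        for pol in (1, -1):
            if pol * corr > best[2]:
                best = (lag, pol, pol * corr)
    return best[0], best[1]
```

A real implementation would use FFT-based cross-correlation instead of
this O(N * lags) loop, but the decision being automated is the same.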
And from the phase feature in Ardour/Mixbus we can see why people would
think you need a DAW.
No, it would be about extending and sharing analysis amongst plugins,
based on an open protocol that allows for decision-making.
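To make the idea of an open protocol concrete, here is a minimal,
entirely hypothetical wire format such a protocol could start from -
the field names and structure are invented for this sketch, not any
existing standard:

```python
import json
from dataclasses import dataclass, asdict, field

@dataclass
class AnalysisMessage:
    """One message in a hypothetical shared-analysis protocol: a plugin
    publishes what it measured so peers (or a coordinator) can act on it."""
    track: str                      # track the analysis refers to
    kind: str                       # e.g. "band_energy", "phase", "centroid"
    payload: dict = field(default_factory=dict)

    def to_wire(self) -> str:
        """Serialise for transport between plugins."""
        return json.dumps(asdict(self))

    @staticmethod
    def from_wire(raw: str) -> "AnalysisMessage":
        """Reconstruct a message received from another plugin."""
        return AnalysisMessage(**json.loads(raw))
```

Nothing more than this is needed to start: plugins that understand a
`kind` act on it, plugins that don't simply ignore it.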
Well, the way it goes, a company will do it and then 3 years later, or
5, there will be a parallel open source project. :)
What am I trying to achieve ? Just a discussion.
Cheers.