hi *!
this unfortunate announcement from nedko seems to have spawned a
discussion on LAA. LAA list policy used to be no follow-ups except for
factual corrections. the idea was to keep the traffic low for people who
want to stay informed but not have to wade through too much mail. (think
lwn.net among others.)
can i suggest that all further contributions to this LAA thread be NAKed
by the moderator, please? the discussion is certainly important, but it
should continue on LAD or LAU.
that said, while i can understand nedko's POV, the initial message never
really belonged on LAA either. obviously, such announcements _will_
cause heated discussion, so they should be made in a forum which allows
them.
best,
jörn
-------- Original Message --------
Subject: [LAA] my lv2-related projects
Date: Tue, 21 Aug 2012 06:34:49 +0300
From: Nedko Arnaudov <nedko(a)arnaudov.name>
To: linux-audio-announce(a)lists.linuxaudio.org
I'm abandoning all lv2 related projects that I currently maintain.
Here is a list:
* zynjacku/lv2rack
* lv2fil
* ssg
* lv2vocoder
* lv2dynparam
* external ui extension
* lv2zynadd [partially, see below]
* maybe something else I don't recall right now
The zyn-rewrite project that produced lv2zynadd stays but will be
cleared from all lv2 code. If anyone wants to take over the
maintainership of any project of mine, contact me. I'll wait a month
before wiping out all online lv2-related resources I control.
I don't want to participate in the lv2 madness anymore. I admit I cannot
communicate rationally with David Robillard. If contributing is not
a pleasure, then one doesn't belong in the community. I wish everyone
involved more luck than I had.
--
Nedko Arnaudov <GnuPG KeyID: 5D1B58ED>
> > My concept with GMPI (not everyone agreed) was that MIDI was not
> > required *in* the plugin.
> > I decided the *host* should provide that routine (mapping MIDI to port
> > value). Written once, available to every plugin developer.
> This is almost exactly what I proposed as an LV2 extension in this
> previous thread - "In practice, however, there are a few border cases
> where the plugin would want to indicate its default MIDI bindings".
Cool, I like it. I disagree that synthesisers are 'border cases' though ;)
> The only real worry is that hosts will be unhappy of the "bloat" added
> to the library they are using.
Yeah, host developers want the 'bloat' in the plugins; plugin developers
want the 'bloat' in the host.
I think a good experiment is to imagine you have to write both an LV2 host
and 100 LV2 plugins, and you have to write MIDI-binding code. Do you put it
in the plugin OR the host?
- If a feature consumes 100 kB RAM and disk space, and it's implemented on
the host side, that's 100 kB.
- If it's implemented on the plugin side, that's 100,000 kB.
Which choice is more 'bloated'?
A very real scenario is you write this MIDI-binding support, ship 50
plugins, then 6 months later discover a bug. Now if that feature is in the
host, that's one fix and everyone is happy. If that bug is in the 50
plugins already shipped to 1000 customers, then you have a much bigger
problem.
It's not a question of 'bloat' YES/NO. The code has to go *somewhere*, there
is only a tradeoff - HOST vs PLUGIN.
My choice was to have very lightweight plugins, and a more sophisticated
host.
P.S.
The one other reason you want the host handling the MIDI Binding...
> On Fri, 2012-06-08 at 09:45 +0000, Jeremy Salwen wrote:
> > Currently, a plugin writer is in a bit of a sticky situation: if the
> > plugin supports MIDI CC events, then the internal parameters are
> > hidden from the host. You can do something where you have a switch
> > which toggles between MIDI CC control, and Control Port control, but
> > this is not a fun thing to do, and I think it is additionally
> > confusing for the user.
True, a plugin's parameters can be set by many sources:
* pre-recorded automation.
* The user tweaking the GUI.
* MIDI messages.
What if they all happen at once? Only the host is in a position to mediate.
For example, if you click down on a plugin's GUI slider, you don't expect
that slider to continue jumping around in response to MIDI automation. The
human is in control until the mouse-up happens, then automation resumes
control. This is a host feature; it can only be implemented if the MIDI
message is decoded by the host, since the plugin can't be changing its own
parameters 'in secret'.
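The mouse-grab mediation described here could be sketched like this (a purely illustrative Python sketch; the class and method names are invented, not from any real host API):

```python
class ParamArbiter:
    """Hypothetical host-side arbiter between GUI, MIDI and automation."""

    def __init__(self):
        self.values = {}        # parameter id -> current value
        self.grabbed = set()    # parameters the user is currently holding

    def gui_mouse_down(self, pid):
        # the human takes control of this parameter
        self.grabbed.add(pid)

    def gui_mouse_up(self, pid):
        # mouse released: automation may resume
        self.grabbed.discard(pid)

    def gui_set(self, pid, value):
        # direct user input always wins
        self.values[pid] = value

    def midi_set(self, pid, value):
        # MIDI/automation is ignored while the user holds the control,
        # so the slider doesn't jump around under the mouse
        if pid not in self.grabbed:
            self.values[pid] = value
```

Only the host can implement this, because only the host sees both the mouse state and the decoded MIDI stream.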
Best Regards,
Jeff
Hi all,
I am porting to LV2 some AMS-influenced plugins (mainly those by Fons)
which have odd 1/Oct frequency ports. I understand why it is sometimes
convenient to use octaves rather than the more typical Hz for frequency,
but after some digging to figure out how to precisely describe this
unit, I discovered the central frequency is middle C, i.e. C4, i.e.
around 262Hz.
For hosts to be able to use such plugins effectively, I need to
precisely describe this unit (and then other plugins can implement to
spec and they will all get along). We already have an 'octaves' unit,
but no base frequency is defined. I can add one, but I am not sure
about this strange choice.
Nobody tunes anything based on middle C, with its odd frequency of
261.62556... Hz. Writing this in a spec gives me pause. I suspect it
evolved from MIDI code in AMS where the 60 of middle C looks as
reasonable as anything else, but when you try to actually
define/document the unit it looks silly.
I think the natural central frequency to use is A440, at precisely a
nice round 440.0 Hz, so 0.0 is A4=440.0Hz, 1.0 is A5=880.0Hz, -1.0 is
A3=220Hz, and so on. Whenever a default or center or tuning frequency
is needed, you use A4/440Hz...
tl;dr: I think the most reasonable standard for an absolute 1/oct
frequency unit is 0.0 = 440Hz
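For concreteness, the proposed unit boils down to this mapping (a sketch, assuming the A4 = 440 Hz center suggested above):

```python
import math

def oct_to_hz(octaves, center=440.0):
    # proposed unit: 0.0 = A4 = 440 Hz, 1.0 = A5 = 880 Hz, -1.0 = A3 = 220 Hz
    return center * 2.0 ** octaves

def hz_to_oct(hz, center=440.0):
    # inverse mapping: octaves relative to the 440 Hz center
    return math.log2(hz / center)
```

Middle C then falls out naturally as oct_to_hz(-9/12) ~ 261.6256 Hz, rather than being the definition of the unit.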
Thoughts?
-dr
Hi,
Does anyone know of a timecode library that allows converting, adding and
subtracting SMPTE timecode, with knowledge of drop-frame timecode etc.,
that can be used in C programs?
The only timecode lib I was able to find is 'mffm-timecode'.
It is C++ only and not really concerned with SMPTE timecode.
I'm thinking about writing one - unless I can find something suitable.
Five of my current projects include replicated code for dealing with
SMPTE. Some use libltcsmpte.sf.net functions to some extent.
I'm currently refactoring libltcsmpte. As it was the first lib I ever
wrote, I made a couple of mistakes in designing its API/ABI. The new
incarnation, libltc, only concerns itself with LTC frames and drop-frame
SMPTE. -- So I'll need a libsmpte of some kind.
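For what it's worth, the 29.97 fps drop-frame label arithmetic such a library has to get right is small but fiddly; a sketch (Python here for brevity, though the library itself would be C):

```python
def frames_to_dropframe(frame):
    """Convert a frame count to a 29.97 fps drop-frame timecode label.

    Drop-frame skips label numbers 00 and 01 at every minute boundary,
    except at minutes divisible by 10. No frames are dropped, only labels.
    """
    fps = 30                             # nominal rate
    drop = 2                             # labels dropped per affected minute
    per_min = fps * 60 - drop            # 1798 frames in a "dropped" minute
    per_10min = per_min * 9 + fps * 60   # 17982 frames per 10 minutes

    d, m = divmod(frame, per_10min)
    if m > drop:
        frame += drop * 9 * d + drop * ((m - drop) // per_min)
    else:
        frame += drop * 9 * d

    ff = frame % fps
    ss = (frame // fps) % 60
    mm = (frame // (fps * 60)) % 60
    hh = frame // (fps * 3600)
    return "%02d:%02d:%02d;%02d" % (hh, mm, ss, ff)
```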
TIA,
robin
> Since the unofficial wiki seems to have disappeared, the documentation
> of the "Jack and Loopback device as Alsa-to-Jack bridge" has gone with
> it. Neither google cache nor the wayback machine are able to serve a
> copy of the page. There are plenty of references to the wiki page on
> the web, but no-one seems to have mirrored the page. Does anyone
> happen to have a copy of the documentation lying around?
>Michael
The documentation has just come back after some database backup issues faced by Mark Constable (maintainer of the wiki page).
Maybe someone can mirror the content somewhere? (jackaudio.org for example)
http://alsa.opensrc.org/Jack_and_Loopback_device_as_Alsa-to-Jack_bridge
Cheers!
J.
Hello, this is my first communication here.
I'm a former Windows user and recent Linux convert. After switching, I
noticed some utilities I regularly used in music production were missing
from the major repositories, simple things like tap-tempo, delay/Hz
calculator, and note-to-frequency conversion. I was looking for an
excuse to learn programming so I started working on this "music toolkit"
of mine. It's all the stuff I need for making music calculations all in
one place (like a producer's Swiss Army knife). Maybe you have a use for
it too? Includes: tap-tempo, delay/Hz calculator, song time calculator,
note-to-frequency converter, simple frequency generator, and a metronome.
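For anyone curious, the math behind two of these calculators is tiny; a sketch (equal temperament and the usual A4 = 440 Hz reference assumed, function names invented):

```python
def note_to_freq(midi_note, a4=440.0):
    # equal temperament: A4 = MIDI note 69 = 440 Hz
    return a4 * 2.0 ** ((midi_note - 69) / 12.0)

def delay_ms(bpm, beats=1.0):
    # delay time in milliseconds for a given number of beats at a tempo
    return 60000.0 / bpm * beats
```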
http://www.brianhilmers.com/code/rasp/
I'm a novice programmer and this is my first project. Advice and help is
welcome. Thanks.
Brian Hilmers
Hi everyone,
as some of you might know already, the "Linux Audio Conference 2013"
will be back in Europe, this time hosted by the
Institute of Electronic Music and Acoustics (iem)
in Graz, Austria.
We are planning to do the conference during
9th - 12th of May 2013
We have checked a number of computer-music related conferences, and it
seems that none of them collides with this date, so *you* should be able
to attend!
We are still in an early stage of organization, but among the things to
expect are:
- inspiring paper sessions on Linux-centered audio
- interesting hands-on workshops
- wild electro-acoustic concerts (possibly using our
higher-order-ambisonics systems for periphonic sound rendering)
- cool club nights
- fuzzy media art installations
- scenic trips to the country-side
- nice people
- numerous things i forgot
I will keep you informed on any news, regarding deadlines, registration,
a website and more.
Stay tuned!
nmfgadsr
IOhannes
---
Institute of Electronic Music and Acoustics (iem)
University of Music and Dramatic Arts, Graz, Austria
http://iem.at/
> I think you are in error considering these things mutually exclusive.
> Yes, hosts dealing with MIDI binding is how things should be done, but
> crippling a plugin API to not be able to handle MIDI is just that:
> crippling. Maybe I want to patch up a bunch of plugins to process MIDI
> events, or have some MIDI effect plugins: these are certainly
> reasonable things to do.
Hi dr,
I think we mis-communicated. MIDI is *fully* supported, including SYSEX,
delivered to plugins as raw unmolested bytes. Plugins can and do function as
MIDI processors.
The 'Guitar de-channelizer' is supplied as an example MIDI processor with
the SDK, as is the 'MIDI to Gate' plugin.
The idea of binding MIDI to the plugin's parameters is a purely optional
alternative.
> LV2 UIs are also like this, though there is an extension to provide a
> pointer to the plugin instance to the UI.
>
> In theory this should only be used for displaying waveforms and such,
> and always be optional.
The way I display waveforms: the API has a function sendMessageToGui() that
sends an arbitrary bunch of bytes to the GUI in a thread-safe manner. You
can build on that to send waveforms etc. Neither the DSP nor the GUI needs a
pointer to the other (but they can have one if they *really* want to).
> Your argument sounds very obviously right because it's about numeric
> parameters, but note and voice control is trickier. That involves
> inventing a new, better event format.
I will disagree and say MIDI note and voice control is pretty good,
*provided* you support MIDI real-time tuning changes (an existing
MIDI SYSEX command that can tune any note to any fractional pitch in
real time, AKA micro-tuning) ...and... support "Key-Based Instrument Control"
(another little-known MIDI command that provides 128 per-note controllers).
By supporting these two MIDI commands you get the familiarity of MIDI with
the addition of:
* Fractional Pitch.
* Per note controllers.
By binding MIDI to the plugin parameters as 32-bit 'float' via metadata,
you remove the need to support MIDI explicitly, you kill the dependency on
MIDI's 7-bit resolution, and you remain open to extending the API to
support OSC in future.
GMPI parameter events include an optional 'voice-number'; this extends the
MIDI-binding system to note events and polyphonic aftertouch. I can build an
entirely MIDI-free synthesiser, yet the metadata bindings make it fully
MIDI-compatible.
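The binding idea can be illustrated with a tiny sketch (names are invented; the point is the host-side 7-bit-to-float scaling and the optional voice number):

```python
from collections import namedtuple

# a GMPI-style parameter event; voice=None means "all voices"
ParamEvent = namedtuple("ParamEvent", "param value voice")

def bind_cc(cc_value, param, lo=0.0, hi=1.0, voice=None):
    # host-side: map a 7-bit MIDI CC value (0..127) onto the parameter's
    # float range and deliver it as an ordinary parameter event, so the
    # plugin itself never sees raw MIDI
    value = lo + (cc_value / 127.0) * (hi - lo)
    return ParamEvent(param, value, voice)
```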
> This is precisely the kind of reason why monolithic non-
> extensible specifications suck.
GMPI is extensible too. For example MS-WINDOWS GUI's are provided as an
extension (so the core spec can be adapted to other platforms), as is
support for some SynthEdit-specific features that don't really belong in
the core spec.
Best Regards,
Jeff
Hi,
If you would like to promote your Linux Audio Business or educational
facilities to a wider audience who are actively looking for information
about Linux Audio there is some banner real estate on
http://linux-audio.com which is available for a limited time free of
charge to early birds.
Please contact me directly and I will give you more details on the process.
--
Patrick Shirkey
Boost Hardware Ltd
Hi,
Thanks Rui for the 'Vee One prototypes', released as LV2 plugins and
standalone versions. http://www.rncbc.org/drupal/node/549
It's nice that the user has the choice of using the sampler/synth as a
plugin or as standalone application. Another nice thing in this release
is that you included Session Management support in the standalone
versions, JackSession in this case.
This makes me wondering:
Why does every developer make his own little single-instance host for
his standalone version of the LV2 plugins? Why isn't there a standard
single-instance LV2 host which can be used by all developers for the
standalone versions of the LV2 plugins they make? A small host devs
can link to, which works like some kind of run-time dependency of the
standalone version. Didn't DSSI have some kind of system like that?
One of the big advantages of this is that you could eliminate a large
part of the session problem in the Linux audio ecosystem. Every new
release of an LV2 standalone application is another application which
needs to be patched for some kind of session management. This is
cumbersome for devs and users.
If that standard single instance LV2 host supports Session Management by
default (NSM/Ladish/JackSession/whatever), you solve a significant part
of the problems you encounter when working with standalone Jack apps on
Linux.
1) Users have the choice to use the plugin as standalone version, with
the SM functionality;
2) Developers don't have to patch their standalone version with SM support;
3) Users have more freedom to use the SM they want, because most new LV2
standalone versions will support the most popular SM systems.
Best regards,
Dirk (alias Rosea Grammostola)