Hi,
Does anyone know of a timecode library that allows converting, adding,
and subtracting SMPTE timecode, with knowledge of drop-frame timecode,
etc., and that can be used in C programs?
The only timecode lib I was able to find is 'mffm-timecode'.
It is C++ only and not really concerned with SMPTE timecode.
I'm thinking about writing one - unless I can find something suitable.
Five of my current projects include replicated code for dealing with
SMPTE. Some use libltcsmpte.sf.net functions to some extent.
I'm currently refactoring libltcsmpte. As it was the first lib I've ever
written, I made a couple of mistakes in designing its API/ABI. The
new incarnation, libltc, only concerns itself with LTC frames and
drop-frame SMPTE. -- So I'll need a libsmpte of some kind.
TIA,
robin
> Since the unofficial wiki seems to have disappeared, the documentation
> of the "Jack and Loopback device as Alsa-to-Jack bridge" has gone with
> it. Neither Google cache nor the Wayback Machine are able to serve a
> copy of the page. There are plenty of references to the wiki page on
> the web, but no-one seems to have mirrored the page. Does anyone
> happen to have a copy of the documentation lying around?
> Michael
The documentation is back online after some database backup issues faced
by Mark Constable (maintainer of the wiki).
Maybe someone can mirror the content somewhere? (jackaudio.org, for example)
http://alsa.opensrc.org/Jack_and_Loopback_device_as_Alsa-to-Jack_bridge
Cheers!
J.
Hello, this is my first communication here.
I'm a former Windows user and recent Linux convert. After switching, I
noticed some utilities I regularly used in music production were missing
from the major repositories, simple things like tap-tempo, delay/Hz
calculator, and note-to-frequency conversion. I was looking for an
excuse to learn programming so I started working on this "music toolkit"
of mine. It's all the stuff I need for making music calculations all in
one place (like a producer's Swiss Army knife). Maybe you have a use for
it too? Includes: tap-tempo, delay/Hz calculator, song time calculator,
note-to-frequency converter, simple frequency generator, and a metronome.
http://www.brianhilmers.com/code/rasp/
I'm a novice programmer and this is my first project. Advice and help are
welcome. Thanks.
Brian Hilmers
Hi everyone,
as some of you might know already, the "Linux Audio Conference 2013"
will be back in Europe, this time hosted by the
Institute of Electronic Music and Acoustics (iem)
in Graz, Austria.
We are planning to do the conference during
9th - 12th of May 2013
We have checked a number of computer-music related conferences, and it
seems that none of them collides with this date, so *you* should be able
to attend!
We are still in an early stage of organization, but among the things to
expect are:
- inspiring paper sessions on linux centered audio
- interesting hands-on workshops
- wild electro acoustic concerts (possibly using our
higher-order-ambisonics systems for periphonic sound rendering)
- cool club nights
- fuzzy media art installations
- scenic trips to the country-side
- nice people
- numerous things i forgot
I will keep you informed on any news, regarding deadlines, registration,
a website and more.
Stay tuned!
IOhannes
---
Institute of Electronic Music and Acoustics (iem)
University of Music and Dramatic Arts, Graz, Austria
http://iem.at/
> I think you are in error considering these things mutually exclusive.
> Yes, hosts dealing with MIDI binding is how things should be done, but
> crippling a plugin API to not be able to handle MIDI is just that:
> crippling. Maybe I want to patch up a bunch of plugins to process MIDI
> events, or have some MIDI effect plugins: these are certainly
> reasonable things to do.
Hi dr,
I think we mis-communicated. MIDI is *fully* supported, including SYSEX,
delivered to plugins as raw unmolested bytes. Plugins can and do function as
MIDI processors.
The 'Guitar de-channelizer' is supplied as an example MIDI processor with
the SDK, as is the 'MIDI to Gate' plugin.
The idea of binding MIDI to the plugin's parameters is a purely optional
alternative.
> LV2 UIs are also like this, though there is an extension to provide a
> pointer to the plugin instance to the UI.
>
> In theory this should only be used for displaying waveforms and such,
> and always be optional.
For displaying waveforms: the API has a function sendMessageToGui() that
sends an arbitrary bunch of bytes to the GUI in a thread-safe manner. You
can build on that to send waveforms etc. Neither the DSP nor the GUI needs
a pointer to the other (but they can have one if they *really* want to).
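The implementation behind sendMessageToGui() isn't shown here, but the usual
way to get that kind of thread-safe, one-way byte transport is a lock-free
single-producer/single-consumer ring buffer. A hedged C11 sketch (all names
are my own, not the actual GMPI/SynthEdit code):

```c
#include <stdatomic.h>
#include <stddef.h>

#define RB_SIZE 4096  /* must be a power of two */

typedef struct {
    unsigned char  data[RB_SIZE];
    _Atomic size_t write_pos;  /* written only by the DSP thread */
    _Atomic size_t read_pos;   /* written only by the GUI thread */
} ringbuf_t;

/* DSP thread: push len bytes; returns 0 (message dropped) if full. */
static int rb_write(ringbuf_t *rb, const void *src, size_t len)
{
    size_t w = atomic_load_explicit(&rb->write_pos, memory_order_relaxed);
    size_t r = atomic_load_explicit(&rb->read_pos,  memory_order_acquire);
    if (RB_SIZE - (w - r) < len)
        return 0;
    for (size_t i = 0; i < len; i++)
        rb->data[(w + i) & (RB_SIZE - 1)] = ((const unsigned char *)src)[i];
    atomic_store_explicit(&rb->write_pos, w + len, memory_order_release);
    return 1;
}

/* GUI thread: pop up to len bytes; returns the count actually read. */
static size_t rb_read(ringbuf_t *rb, void *dst, size_t len)
{
    size_t r = atomic_load_explicit(&rb->read_pos,  memory_order_relaxed);
    size_t w = atomic_load_explicit(&rb->write_pos, memory_order_acquire);
    size_t avail = w - r;
    if (len > avail)
        len = avail;
    for (size_t i = 0; i < len; i++)
        ((unsigned char *)dst)[i] = rb->data[(r + i) & (RB_SIZE - 1)];
    atomic_store_explicit(&rb->read_pos, r + len, memory_order_release);
    return len;
}
```

With exactly one writer and one reader, no locks are needed, so the DSP side
stays real-time safe; the GUI simply polls rb_read() on a timer.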
> Your argument sounds very obviously right because it's about numeric
> parameters, but note and voice control is trickier. That involves
> inventing a new, better event format.
I will disagree and say MIDI note and voice control is pretty good,
*provided* you support MIDI real-time tuning changes (an existing
MIDI SYSEX command that can tune any note to any fractional pitch in
real time, a.k.a. micro-tuning) ...and... support "Key-Based Instrument
Control" (another little-known MIDI command that provides 128 per-note
controllers).
By supporting these two MIDI commands you get the familiarity of MIDI with
the addition of:
* Fractional Pitch.
* Per note controllers.
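For reference, the real-time single-note tuning change mentioned above is
part of the MIDI Tuning Standard and has a fixed SYSEX layout. A small C
sketch that builds one such message -- the helper name and buffer handling
are my own, and the byte layout follows the MIDI Tuning Standard as I
understand it:

```c
#include <stdint.h>

/* Build a real-time single-note tuning change (MIDI Tuning Standard):
 *   F0 7F <dev> 08 02 <prog> <count> [kk tt mm ll] F7
 * Target pitch = tt semitones + (mm*128 + ll)/16384 of a semitone.
 * Returns the message length in bytes. */
static int build_rt_tuning(uint8_t *buf, uint8_t device, uint8_t program,
                           uint8_t key, double target_note /* fractional */)
{
    int semis = (int)target_note;                       /* whole semitone  */
    int frac  = (int)((target_note - semis) * 16384.0 + 0.5);
    if (frac > 16383)
        frac = 16383;                                   /* clamp fraction  */

    int i = 0;
    buf[i++] = 0xF0; buf[i++] = 0x7F;       /* universal real-time SYSEX   */
    buf[i++] = device;
    buf[i++] = 0x08; buf[i++] = 0x02;       /* tuning, note change         */
    buf[i++] = program;                     /* tuning program number       */
    buf[i++] = 1;                           /* one note changed            */
    buf[i++] = key;                         /* which key to retune         */
    buf[i++] = (uint8_t)semis;              /* nearest semitone below      */
    buf[i++] = (uint8_t)((frac >> 7) & 0x7F);  /* fraction MSB             */
    buf[i++] = (uint8_t)(frac & 0x7F);         /* fraction LSB             */
    buf[i++] = 0xF7;
    return i;
}
```

E.g. retuning A4 (key 69) up by 50 cents is a target of 69.5, which encodes
as semitone 69 with fraction 8192/16384.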
By binding MIDI to the plugin parameters as 32-bit 'float' via metadata,
you remove the need to support MIDI explicitly, you kill the dependency on
MIDI's 7-bit resolution, and you remain open to extending the API to
support OSC in the future.
GMPI parameter events include an optional 'voice-number', this extends the
MIDI-binding system to note events and polyphonic aftertouch. I can build an
entirely MIDI-free synthesiser, yet the metadata bindings make it fully
MIDI-compatible.
> This is precisely the kind of reason why monolithic non-
> extensible specifications suck.
GMPI is extensible too. For example, MS-Windows GUIs are provided as an
extension (so the core spec can be adapted to other platforms), as is
support for some SynthEdit-specific features that don't really belong in
the core spec.
Best Regards!,
Jeff
Hi,
If you would like to promote your Linux Audio Business or educational
facilities to a wider audience who are actively looking for information
about Linux Audio there is some banner real estate on
http://linux-audio.com which is available for a limited time free of
charge to early birds.
Please contact me directly and I will give you more details on the process.
--
Patrick Shirkey
Boost Hardware Ltd
Hi,
Thanks Rui for the 'Vee One prototypes', released as LV2 plugins and
standalone versions. http://www.rncbc.org/drupal/node/549
It's nice that the user has the choice of using the sampler/synth as a
plugin or as standalone application. Another nice thing in this release
is that you included Session Management support in the standalone
versions, JackSession in this case.
This makes me wondering:
Why does every developer make his own little single-instance host for
his standalone version of the LV2 plugins? Why isn't there a standard
single-instance LV2 host which can be used by all developers for the
standalone versions of the LV2 plugins they make? A small host that
devs can link to and which works like some kind of run-time dependency
of the standalone version. Didn't DSSI have some kind of system like that?
One of the big advantages of this is that you could eliminate a large
part of the session problem in the Linuxaudio ecosystem. Every new
release of a LV2 standalone application is another application which
needs to be patched for some kind of Session Management. This is
cumbersome for devs and users.
If that standard single instance LV2 host supports Session Management by
default (NSM/Ladish/JackSession/whatever), you solve a significant part
of the problems you encounter when working with standalone Jack apps on
Linux.
1) Users have the choice to use the plugin as standalone version, with
the SM functionality;
2) Developers don't have to patch their standalone version with SM support;
3) Users have more freedom to use the SM they want, because most new LV2
standalone versions will support the most popular SM systems.
Best regards,
Dirk (alias Rosea Grammostola)
Hello fellas!
You probably bumped into one or two videos of AudioGL, a sort of modular
DAW (http://youtu.be/bCC9uHHAEuA).
Of course, none of us knows whether the video is genuine, whether it
is as smooth as it appears to be, or whether it really works as well
as it seems in the demo.
However, assuming that it is, it does display a rather robust music
production environment.
I have two questions.
1. How is it possible that such complex functionality seems to run that
smoothly? Or does it look more complex than it really is? The GUI seems
very responsive and animated.
2. The project seems to be developed just by one person, yet it looks like
a very functional music sequencer. Am I missing something here or is it a
matter of personal talent and commitment?
Cheers!
--
Louigi Verona
http://www.louigiverona.ru/
Hi all,
I've been working on an LV2 instrument plugin, and it consumes about 1-2%
CPU when idle. When I leave it for about 20 seconds, the CPU usage jumps
to 38-40% of a core, and JACK xruns. The code contains IIRs for a reverb
effect,
so I'm going to blame this CPU burning on denormal values.
I'm using waf as the build system, and appending "-O3" and "-ffast-math" to
the CFLAGS and CXXFLAGS. Building with ./waf -v shows the runner thread to
have the "-O3" and "-ffast-math" in the command.
Yet when I run it it still hogs CPU after about 10-20 seconds.
Reading gcc's pages (http://www.acsu.buffalo.edu/~charngda/cc.html) tells
me that if DenormalsAreZero and FlushToZero are set, the program should be
linked with crtfastmath.o. I don't know how to check whether this is
happening.
I'm not sure where to look next to fix the problem. Help appreciated!
-Harry
I have adapted the GMPI requirements final draft document to a
comparison with the current state of LV2: http://lv2plug.in/gmpi.html
A couple of nonsense baroque ideas aside, most of the requirements are
met, though there are still important gaps. I mention it here in case
anyone has an interest, please feel free to address any of the points
made in this document.
It may also be useful to augment this document with additional
requirements, particularly since there's several knowledgeable folks who
may not grok the *how* of LV2 but know *what* they need in terms of
general requirements. I will add a section for this if anybody has any
input.
Perhaps this will serve as a good road map that is not too bogged down
with details.
Cheers,
-dr