Hey All,
I'm looking to improve a program's design with regards to how it
communicates between the "jack" thread and its "main/gui" thread. Please
note I'm *not* looking for implementation details like what ringbuffer to
use, this has been discussed here before.
Conditions:
The GUI needs to feed data through to the JACK thread (data = parameter moves,
etc.)
The JACK thread needs to push data back (buffers for waveforms & "playhead"
info)
The real question:
What is a neat solution for passing various kinds of data through a ringbuffer?
My (hacky?) solution: create a class, call it "Event". The runtime flow now
looks like this:
1. Create an EventType enum, set the details
2. Write those "Events" into the ringbuffer
3. Switch based on the EventType, and handle the event.
While not terribly ugly, that Event class starts to get bigger & nastier, so
I considered sub-classing it... but I'm not sure this is going in the right
direction.
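For illustration, the kind of thing I mean looks roughly like this (all names
made up, just a sketch): a small trivially-copyable Event with a type tag that
can be copied through the ringbuffer, plus a switch on the reading side.

enum EventType { PARAM_CHANGE, PLAYHEAD_POS, WAVEFORM_BUFFER };

struct ParamChange   { int param_id; float value; };
struct PlayheadInfo  { float position; };
struct WaveformChunk { const float* data; int frames; }; // buffer owned elsewhere

struct Event {
    EventType type;
    union {                    // payload is interpreted according to 'type'
        ParamChange   param;
        PlayheadInfo  playhead;
        WaveformChunk waveform;
    };
};

// reading side (either thread, depending on the direction of the ringbuffer):
void handle(const Event& e) {
    switch (e.type) {
    case PARAM_CHANGE:
        // apply e.param.value to parameter e.param.param_id
        break;
    case PLAYHEAD_POS:
        // update the playhead display from e.playhead.position
        break;
    case WAVEFORM_BUFFER:
        // consume e.waveform.frames samples from e.waveform.data
        break;
    }
}

The Event stays a plain struct, so it can go through e.g. a jack_ringbuffer
with a straight memcpy; the ugliness just moves into the union and the switch.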
I'm very interested in how the "big" programs around have approached this
problem... Cheers, -Harry
Hey All,
I'm faced with a problem that I can't see an easy way around, regarding the
use of the IR reverb plugin.
I'm running the GUI in a separate *process*, and doing all LV2 communication
over OSC. For the most part this is easy;
the problem rears its head when one wants to use a plugin whose UI requires
"instance-access".
Basically, the IR plugin GUI needs access to the LV2_Handle, but I can't
provide that due to the OSC communication.
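For reference, an in-process host would normally satisfy instance-access by
simply handing the plugin's handle to the UI as feature data, roughly like
this (just a sketch, header path as in the LV2 SDK):

#include <lv2/lv2plug.in/ns/ext/instance-access/instance-access.h>

// In-process case: the feature data is the plugin's LV2_Handle, i.e. a raw
// pointer, which is meaningless once the UI lives in another process.
void instantiate_ui_with_instance_access(LV2_Handle plugin_handle)
{
    LV2_Feature instance_feature;
    instance_feature.URI  = LV2_INSTANCE_ACCESS_URI; // "http://lv2plug.in/ns/ext/instance-access"
    instance_feature.data = plugin_handle;

    const LV2_Feature* ui_features[] = { &instance_feature, NULL };
    // ... pass ui_features to the UI's instantiate() here ...
    (void)ui_features;
}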
So I've considered "spoofing" a plugin on the UI side, and keeping it up to
date with what the "real" one is doing in the engine.
A bit ugly, and if the UI has instance-access, will it still receive the
normal "port" events? Because otherwise I'm lost as to how to
get at the UI data.
The other problem is that loading a "sample" into the IR convolution happens
in a pretty strange way: there are 3 control input ports, and together they
make up a 64-bit file hash.
I understand the reasons behind this decision, and I'm not trying to
criticise the implementation, I'm just not sure how I can send a certain
file to these ports to make it work...
-Harry
On 09/02/2011 12:54 PM, Pedro Alves wrote:
>> boost::function<void(void)> serves this purpose for me in jass. To
>> create and pass a functor that assigns a new auditor generator to the
>> one in the engine, and then tells it to play, I do for example:
>>
>> write_blocking_command(assign(engine_.auditor_gen, p));
>> write_blocking_command(boost::bind(&engine::play_auditor,
>>                                    boost::ref(engine_)));
>>
>> assign() is just a utility template to make creating functors that do
>> assignments easier. boost::bind is used to make all passed functors
>> 0-ary (e.g. for binding member functions to their instance or binding
>> arguments to the functor), and write_blocking_command is just a utility
>> function that disables the GUI until the acknowledgement from the
>> engine has come back to the GUI. The command ringbuffer is just a
>> ringbuffer holding boost::function<void(void)> objects:
>>
>> typedef ringbuffer<boost::function<void(void)> > command_ringbuffer;
>>
>> Examples from here:
>> https://github.com/fps/jass/blob/master/main_window.h
>> https://github.com/fps/jass/blob/master/assign.h
>> https://github.com/fps/jass/blob/master/engine.h
>>
>> Regards, Flo
> There are alternatives (even predecessors) to boost.function that
> are much faster and avoid the heap/new. See
> <http://www.codeproject.com/KB/cpp/fastdelegate2.aspx> for example.
> (I used Don's original "Fastest Possible C++ Delegates" on an embedded
> project years ago -- worked great).
>
Oh, thanks for the interesting read :D
Flo
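For anyone following along, the pattern boils down to something like this
minimal single-producer/single-consumer sketch (illustrative code, not jass's
actual implementation; std::function stands in for boost::function<void(void)>):

#include <atomic>
#include <cstddef>
#include <functional>

// One writer (the GUI thread), one reader (the JACK thread).
template <std::size_t N>
struct command_ringbuffer {
    std::function<void()> slots[N];
    std::atomic<std::size_t> write_pos{0};
    std::atomic<std::size_t> read_pos{0};

    // GUI side: returns false when the ring is full.
    bool push(std::function<void()> f) {
        std::size_t w = write_pos.load(std::memory_order_relaxed);
        std::size_t next = (w + 1) % N;
        if (next == read_pos.load(std::memory_order_acquire))
            return false;
        slots[w] = std::move(f);
        write_pos.store(next, std::memory_order_release);
        return true;
    }

    // JACK process callback: execute everything queued so far.
    void run_all() {
        std::size_t r = read_pos.load(std::memory_order_relaxed);
        while (r != write_pos.load(std::memory_order_acquire)) {
            slots[r]();          // run the command
            slots[r] = nullptr;  // note: may release heap memory in the RT thread
            r = (r + 1) % N;
            read_pos.store(r, std::memory_order_release);
        }
    }
};

// GUI side usage, e.g.:
//   command_ringbuffer<64> ring;
//   ring.push([&engine]{ engine.play_auditor(); });

The heap allocations hidden inside boost::function/std::function when commands
are captured are exactly what the FastDelegate-style alternatives above avoid.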
Hi all,
My name is Thijs and I have been helping out with Hydrogen for some time now
(mainly the manual, the website and a tiny bit of coding).
Anyway, since jack_session support was recently added to Hydrogen (in v0.9.6,
which is currently still in development but available from svn via
hydrogen-music.org), I decided to test jack_session a bit and I must say I'm
really excited about this :-)
IMHO this is the most critical piece of the puzzle and it makes Linux audio
_so_ much more usable! So lots of kudos to everyone that has been working
on it !!
I have done several 'intro sessions' for Mac audio users and they are
generally really impressed with what they see, but the lack of a good
session management mechanism is a real showstopper for them... (I know
there are other options, but none of those have ever allowed me to truly
save my session *at any given moment*.)
However, the first thing that slowed me down was simply knowing which apps
already support jack_session.
I know that jack_session is fairly new, so it's normal that some info is
missing, but it would be great if the websites out there that list the
available Linux audio apps (like
http://apps.linuxaudio.org/apps/categories/jack) would also indicate whether
the app supports jack_session.
The jack_session page on the jackaudio.org wiki
<http://trac.jackaudio.org/wiki/WalkThrough/User/jack_session> also
lists a number of apps, but I don't think it's up to date, is it?
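For app developers wondering what supporting jack_session actually involves:
the client side is quite small. A rough sketch, based on the walkthrough
linked above (real state saving and error handling omitted; "my_app" is a
placeholder):

#include <jack/jack.h>
#include <jack/session.h>
#include <cstdio>
#include <cstring>

static void session_callback(jack_session_event_t* event, void* arg)
{
    jack_client_t* client = static_cast<jack_client_t*>(arg);

    // 1. Save the application's state under event->session_dir (app-specific).

    // 2. Tell the session manager how to restart us with that state.
    char command[256];
    std::snprintf(command, sizeof(command), "my_app --uuid %s \"%s\"",
                  event->client_uuid, event->session_dir);
    event->command_line = strdup(command);

    jack_session_reply(client, event);

    if (event->type == JackSessionSaveAndQuit) {
        // tell the main loop to quit (app-specific)
    }
    jack_session_event_free(event);
}

// at startup, after jack_client_open():
//   jack_set_session_callback(client, session_callback, client);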
Any comments?
grtz
Thijs
PS: I'd be happy to help out with documenting jack_session if needed
PS: it would be nice to have a 'jack_session enabled app' banner to put on
the sites of the apps that support it. Graphical designers?
--
follow me on my Audio & Linux blog <http://audio-and-linux.blogspot.com/> !
Brand new LV2 plugin soft synth here. The Newtonator (v0.5.0) features:
 * LV2-compliant synth plugin
 * Simple, easy-to-use GTK-based GUI
 * 3-note polyphony (err, um, until I can do some serious optimization)
 * "Unique" synthesis algorithm (Tuna Pagan Fellowship)
The Newtonator specializes in making crazy, harsh sounds, so if you're
looking for some sounds to produce the next Yanni album, keep
looking. Please refer to the user manual
(http://newtonator.sourceforge.net/manual/index.html) for more gory
details.
Download: http://sourceforge.net/projects/newtonator/files/newtonator-0.5.0.tar.gz/do…
NOTE: This plugin was written against the excellent libraries included in
the lv2-c++-tools package written by Lars Luthman. However, as of v1.0.4,
the GUI library only supports rev. 2 of the LV2 specification, so it may
have problems on newer hosts (though I have done some (very) preliminary
testing against Ardour3 and it seems to work ok). Alpha testing mostly
done against zynjacku and Elven (comes with ll-plugins package) hosts.
This puppy's in beta, so feel free to get in touch with me at the project
forum (https://sourceforge.net/projects/newtonator/forums/forum/1820765)
or submit a bug if needed.
Thanks, and have fun.
Michael Bechard
https://sourceforge.net/projects/newtonator/
Hi,
during the process of writing a new small JACK sampler that fits my
workflow, I came up with this little scheme to solve the UI/engine
decoupling problem. For the purpose of spreading the idea, or alternatively
getting answers about how it's broken and sucks, I decided
to write a little article describing it:
http://178.63.2.231/~tapas/wordpress/?page_id=45
The (largely unfinished and unusable) sampler project is here:
https://github.com/fps/jass
Let me have it..
Regards,
Flo
All:
A recruiter approached me looking for an ALSA driver developer to
enhance an existing driver. This is a short-term, off-site gig. Some
details below. Please contact Mr. DeAmelio directly.
Thanks,
Gabriel
---------- Forwarded message ----------
From: David DeAmelio <david_deamelio(a)oxfordcorp.com>
Date: Tue, Aug 16, 2011 at 2:29 PM
Subject: Job Description - Oxford International
To: gabrbedd(a)gmail.com
Hey Gabriel,
Here’s the description….Thanks.
Location: OFFSITE
Start Date: ASAP
Duration: 1-2 Weeks
Manager needs consultant to come onsite for 1st few days. Client will
pay for flight there and back.
Must:
Linux ALSA device drivers (ALSA: Advanced Linux Sound Architecture)
Blackfin
Plus:
D2D: Consultant will be enhancing an audio driver to support a voice
quality recorder
__________________________________
David De Amelio
Team Lead
Software/Hardware Consulting Services
Oxford International
  a division of Oxford Global Resources
100 Cummings Center Suite 206N
Beverly, MA 01915
877.258.9982 x 7608 Office
978.922.7547 FAX
david_deamelio(a)oxfordcorp.com
www.oxfordcorp.com
Oxford Europe
The Right Talent. Right Now.®
Hello everyone,
Everyone knows Yoshimi, the fork of ZynAddSubFX.
One thing was missing for Yoshimi to be perfect: being nearly fully
controllable by MIDI (no OSC, sorry).
ZynAddSubFX made it possible to control a few parameters with
complicated NRPNs, and Yoshimi recently gained some such features too
(in the test versions).
But now I'm proud to announce the work of licnep (not me, I'm just
a bug reporter), who made the "midiLearn" function for Yoshimi. It's not
stable yet because it's recent, and not complete, but here are the
present features:
* Control of system effects and part insert effects
* Master/part volume, pan, system effect sends
* Most of the ADsynth parameters
* Add/remove controllers
* Automatic detection of the MIDI channel and controller number
* Reset a knob's position
I think it's a very useful feature that could help many
Yoshimi/Zyn users.
Using it is simple: connect your controller to Yoshimi,
right-click on a blue knob (the yellow ones are not supported for
now), click "midi Learn", and move your controller; it detects
the controller automatically.
To see and modify controllers, go to the Yoshimi > MIDI controllers menu.
To remove the MIDI control of a knob, simply right-click on it and click
"remove midi control".
Here is the GitHub repository: https://github.com/licnep/yoshimi
To download and install it, follow the instructions on GitHub:
https://github.com/licnep/yoshimi/wiki/How-to
A short page explaining how to set up control of parameters that are not yet
implemented:
https://github.com/licnep/yoshimi/wiki/Source-code
Pages to follow the news of the project:
Facebook: https://www.facebook.com/pages/Yoshimi-midi-learn/224823617534934
Twitter: http://twitter.com/#!/YoshimiMIDI
So if you're interested, bug reports are very welcome.
Cheers,
Louis CHEREL.