Dear Linux Audio developer, user, composer, musician, philosopher
and anyone else interested, you are invited to the...
Linux Audio Conference 2011
The Open Source Music and Audio Software Conference
May 6-8 2011
Music Department, National University of Ireland, Maynooth
Maynooth, Co.Kildare, Ireland
http://music.nuim.ie
As in previous years, we will have a full programme of talks,
workshops and music.
Two calls will be issued, a Call for Papers (see below) and Call for
Music (soon to be announced).
Further information can be found on the LAC2011 website (under
construction).
================ CALL FOR PAPERS =================
Papers in the following categories (but not limited to them) are now
invited for submission:
* Ambisonics
* Education
* Live performance
* Audio Hardware Support
* Signal Processing
* Music Composition
* Audio Languages
* Sound Synthesis
* Audio Plugins
* MIDI
* Music Production
* Linux Kernel
* Physical Computing
* Interface Design
* Linux Distributions
* Networked Audio
* Video
* Games
* Media Art
* Licensing
We very much welcome practical papers and software demos ("how I use
Linux Audio applications to create my music/media art").
Paper length: 4-8 pages, with abstract (50-100 words) and up to 5
keywords.
Language: English.
The copyright of the paper remains with the author, but we reserve the
right to create printed proceedings from all submitted (and accepted)
papers.
IMPORTANT DATES:
Submission deadline: 15 January 2011
Notification of acceptance: 7 March 2011
Camera-ready papers: 1 April 2011
Queries: Victor Lazzarini, NUI Maynooth (victor.lazzarini(a)nuim.ie)
if there was a standard that described the expected behavior of commonly used
(or just useful) knob control methods, perhaps in the form of a short draft
specification on freedesktop.org, would people (ie developers) use it?
expectations and requirements clearly differ, so we would probably need to
gather together and discuss the methods in use, with a mind to removing or
combining similar ones, explicitly naming them and defining their behavior
in the abstract. if we required that applications implement a small set of
methods as configurable options (as a minimum) in order to comply, then we
might eventually see the back of the unpredictable mess we have now. it
wouldn't matter how they are configured, be it gconf, dotfile, command line,
prefs dialog, env variable... as long as it could be done.
worth thinking about or is it just too niche? are the actual real world
applications too varied and specific to make it useful?
for example it would certainly be nice to crack open a library config
or application prefs window, assign VLMC (vertical linear motion control*)
to the left mouse button and RMC (radial motion control*) to
shift + left mouse button, and have those conform to the expected standards.
or even just to set KNOB_CONTROL_METHOD=RMC in your global environment
and have all the adherent applications just do what you expect.
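as a sketch of what "adherent" could mean in practice (VLMC/RMC are just my
placeholder names from above, not any real standard, and the enum is
hypothetical), an application might select its method like this:

```c
#include <stdlib.h>
#include <string.h>

/* hypothetical method names -- VLMC and RMC are the placeholder
 * names from this post, not part of any existing specification */
typedef enum {
    KNOB_METHOD_VLMC,    /* vertical linear motion control */
    KNOB_METHOD_RMC,     /* radial motion control */
    KNOB_METHOD_DEFAULT  /* per-application fallback */
} knob_method;

/* pick the knob control method from the environment, falling back
 * to the application default when the variable is unset or unknown */
knob_method knob_method_from_env(void)
{
    const char *m = getenv("KNOB_CONTROL_METHOD");
    if (m == NULL)
        return KNOB_METHOD_DEFAULT;
    if (strcmp(m, "VLMC") == 0)
        return KNOB_METHOD_VLMC;
    if (strcmp(m, "RMC") == 0)
        return KNOB_METHOD_RMC;
    return KNOB_METHOD_DEFAULT;
}
```

the same lookup could equally sit behind gconf or a dotfile; the point is
only that the named methods behave identically everywhere.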
it seems to me that standards (of the mutually agreed rather than officially
sanctioned variety, since the latter is impractical) provide the best means
to bring about common behavior in sovereign systems. naturally it would be
completely platform independent.
just having named control methods with explicitly defined behavior may
help matters too.
cheers,
pete.
*crappy names i'm sure, but you get the idea.
WRT the recent discussion about pixmap knob widgets and theme
conformance (that i can't reply to since i wasn't on the list
at the time, sorry)
there are a couple of ways that you might achieve this.
the crux gtk theme engine includes some pixmap recolouring code
(or used to at any rate). it recolours areas of a pixmap that
contain only green values to a colour specified in the gtkrc.
this might conceivably be stolen and incorporated
to provide some measure of theme conformance for pixmap
based widgets (knobs, wheels and potentially sliders).
this method places specific constraints on the source pixmaps used.
those constraints are easily adhered to when creating pixel art
(which the crux pixmaps were), but procedurally generated rasters
(vector or 3d renders) of the kind likely to be used with
a pixmap widget might pose more of a problem, since lighting and
anti-aliasing tend to introduce a variety of colours.
still, i expect you could get something to work with a bit of effort.
(imagemagick or scriptfu in the gimp may help there).
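for illustration, the recolouring trick might look something like this (a
sketch of the general idea, not the actual crux code; the pure-green
placeholder convention and the intensity scaling are my assumptions):

```c
#include <stdint.h>

/* Recolour the "pure green" regions of an RGBA pixmap to a theme
 * colour, in the spirit of the crux engine's trick (a sketch, not
 * the real crux code): any pixel whose red and blue channels are
 * zero is treated as a placeholder, and its green value is used as
 * the intensity of the replacement colour (tr, tg, tb). */
void recolour_green(uint8_t *rgba, int n_pixels,
                    uint8_t tr, uint8_t tg, uint8_t tb)
{
    for (int i = 0; i < n_pixels; i++) {
        uint8_t *p = rgba + 4 * i;
        if (p[0] == 0 && p[2] == 0 && p[1] > 0) {
            unsigned g = p[1];           /* placeholder intensity */
            p[0] = (uint8_t)(tr * g / 255);
            p[1] = (uint8_t)(tg * g / 255);
            p[2] = (uint8_t)(tb * g / 255);
        }
        /* pixels with any red or blue are left untouched */
    }
}
```

you can see where anti-aliased renders break this: a green edge blended
against grey picks up red and blue and silently escapes the recolour.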
another possibility that was briefly discussed for use with
phat was that of a composite widget with different layers that
could be drawn separately, one on top of the other. eg render an SVG
or pixmap as a background on the first pass, then draw something
with cairo (a value indicator..) on top.
there are obvious limits to what you can achieve with this kind of
thing, but you could get some complex effects on a knob while still
maintaining procedural control over the size, colour and shape of the
vector elements. (tick marks around the knob, value indicator size
and colour etc). IIRC we discarded the idea due to its complexity.
we wanted a generally configurable knob and the vector elements would
need anything from extensive widget options right up to a full blown
markup language to describe them (not a problem for app specific
widgets).
cheers,
pete.
Hi,
BoxySeq is still a long way from being suitable for end users, but I've
decided to post an update here to let people know that I'm still
working on it :-)
"The classification of what BoxySeq is, resides somewhere between
sequencer and arpeggiator. The core concept of BoxySeq is to use a
window-manager-like window-placement-algorithm to generate pitch and
velocity data as it sequences events in real time (via the JACK Audio
Connection Kit’s MIDI API)."
More details of how it (should) work(s):
http://github.com/jwm-art-net/BoxySeq/wiki
My latest demo with BoxySeq connected to Yoshimi and recorded in mhWaveEdit
http://jwm-art.net/art/audio/boxyseq_demo_28_09_2010.ogg
(the demo utilizes a simple pattern which feeds into 8 boundary boxes.
I move the boundaries around, switching them to blocking mode and back
to play mode for various melodic arpeggiated effects etc).
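For readers wondering what a window-placement algorithm has to do with
pitch, here is a toy first-fit sketch. It is my own simplification for
illustration, not BoxySeq's actual code; the 128-column grid and the
direct x-to-pitch mapping are assumptions:

```c
#include <stdint.h>

#define GRID_W 128  /* one column per MIDI pitch (assumption) */

/* Place a fixed-width "event" box inside a boundary using first-fit,
 * window-manager style, and read the leftmost column of the placed
 * box as its MIDI pitch. occupied[] marks used columns. Returns the
 * pitch, or -1 if the box does not fit inside the boundary. */
int place_event(uint8_t occupied[GRID_W],
                int bound_x, int bound_w, int box_w)
{
    for (int x = bound_x; x + box_w <= bound_x + bound_w; x++) {
        int fits = 1;
        for (int i = 0; i < box_w; i++)
            if (occupied[x + i]) { fits = 0; break; }
        if (fits) {
            for (int i = 0; i < box_w; i++)
                occupied[x + i] = 1;    /* claim the columns */
            return x;  /* x position doubles as the pitch */
        }
    }
    return -1;  /* boundary full: nothing sounds */
}
```

Moving or shrinking the boundary changes where subsequent events land,
which is roughly why dragging boundaries in the demo changes the
arpeggiated melodic shapes.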
Since the last time I posted to LAD/LAU I've added some usability features:
Improved boundary movement/resize code.
Added zoom functionality.
Added scrollbars.
Added keyboard shortcuts.
(select a boundary by hovering the mouse over it)
left, right, up, down - move the selected boundary 1 unit in that direction
b, B - make the boundary turn all events into Blocks (ie events which
don't emit midi messages but are still placed)
i, I - make the boundary Ignore all events.
p, P - make the boundary Play all events as normal.
- - bring the boundary closer to the front of the boundary list
+ - move the boundary further toward the end of the list
(the last two are slightly confusing, I know)
that's about it - but it feels like I've done a whole lot more.
the next step is to get one of the most fundamental features of
BoxySeq working, and that is to allow the user to create static block
boxes (these can be placed anywhere in the grid and prevent a boundary
placing any events in that location). Unfortunately it's not quite as
straightforward as it sounds.
Cheers,
James.
--
_
: http://jwm-art.net/
-audio/image/text/code
Friends, MusE 1.1 is here!
[Introduction]
MusE is a combined midi and audio sequencer which tries
to cover most bases for the Linux computer studio.
MusE is one of the oldest sequencers on the Linux audio scene and is
today a very stable open source solution for everyday music making.
This release adds some new features, lots of bugfixes and a bunch
of usability improvements.
  MusE : http://muse-sequencer.org
[Highlights]
* Jack midi support.
* Allow native VST guis for plugins
* Audio and midi routing popup menus now stay open, for making rapid
connections.
* MusE now has two mixers, with selectable track type display.
* External midi sync fixes and improvements, should be very stable
* Some pianoroll improvements
* Some crash fixes
* Drum editor fixes
* Various arranger fixes and improvements
* Various improvements for plugin guis
* Routing fixes
* Stability fixes for plugins
* Various DSSI fixes
* Rec enabled track moves with selection when only one track is rec enabled
* Jack midi, routing system, multichannel synth ins/outs, midi strips
and trackinfo pane.
* Dummy audio driver: Added global settings for sample rate and period size.
* Arranger track list: Quick 'right-click' or 'ctrl-click' or
'ctrl-mouse-wheel' toggling of Track On/Off.
* Allow changing timebase master
* Option to split imported midi tracks into multiple parts.
* Several new keyboard shortcuts for various operations, see shortcut editor
* Several colour tweaks and other cosmetic changes
* Various stability fixes
* Countless fixes and tweaks, about 300 lines in the Changelog,
 check it for a complete list of the blood, sweat and tears
[What is MusE again?]
MusE is a multitrack virtual studio with support for:
* Midi
 * jack midi
 * internal softsynths, including soundfont player FluidSynth
 and sample player Simple Drums
 * DSSI softsynths, including VST instruments
 * with a patch to DSSI, VST-chunks are handled
 * Drum editor
 * Pianoroll
 * Conventional arranger
 * midi automation
 * and lots more
* Audio
 * Jack
 * Jack transport
 * LADSPA plugins
 * VST plugins through dssi-vst
 * audio automation, old sch00l
 * and lots more
[ChangeLog]
For a complete list of changes, check the ChangeLog in
the package or online at the sourceforge site:
http://lmuse.svn.sourceforge.net/viewvc/lmuse/trunk/muse/ChangeLog?revision…
[Download]
http://muse-sequencer.org/index.php/Download
Keep on rocking!
The MusE team
I think the answer belongs to the list. Maybe others will correct me...
On Sunday 26 September 2010 12:04:17 you wrote:
> Hi Arnold, your hint is a real revelation to me! I have spent the night
> thinking about it and now I have a question: if I drive the beat counter
> via the sampling clock (you mean the internal clock of the sound card,
> right?) and the alsa process is in blocking mode, the audio thread itself
> becomes a sort of metronome where a chunk of data is a single tick,
> doesn't it?
The audio thread becomes the metronome. But don't confuse the chunks of data
with the ticks of your bars:beats:ticks.
You get blocks of samples from the device (to write to or read from), you know
what sampling rate you use, and you know how many samples you have already
processed. From that you calculate your clock.
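In C that calculation is just arithmetic on the frame count. A sketch
(tempo, PPQN and time signature are illustrative parameters here, not
anything ALSA or jack hands you for free):

```c
#include <stdint.h>

typedef struct { int bar; int beat; int tick; } bbt_pos;

/* Derive a bars:beats:ticks position purely from the number of
 * frames already processed. sample_rate in Hz, bpm is the tempo,
 * ppqn is ticks per quarter note, beats_per_bar e.g. 4 for 4/4.
 * Bars, beats and ticks are all counted from zero. */
bbt_pos frames_to_bbt(uint64_t frames, double sample_rate,
                      double bpm, int ppqn, int beats_per_bar)
{
    bbt_pos p;
    /* seconds elapsed = frames / sample_rate; beats = seconds * bpm/60 */
    double beats = (double)frames / sample_rate * bpm / 60.0;
    uint64_t total_ticks = (uint64_t)(beats * ppqn);
    uint64_t total_beats = total_ticks / ppqn;
    p.tick = (int)(total_ticks % ppqn);
    p.beat = (int)(total_beats % beats_per_bar);
    p.bar  = (int)(total_beats / beats_per_bar);
    return p;
}
```

Call it with the running frame total after each period you process and
the clock never drifts, because it is slaved to the sound card.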
BTW: it sounds as if you are just beginning to write audio apps: start with
jack, its API for clients is easier than alsa's. At least that's what I'm
told, I never used the alsa API.
Another advantage of jack is that you get the global jack-transport for free.
Which means your sampler/looper will sync with your other soft-synths and with
your recording app.
Have fun,
Arnold
Hi guys,
first of all forgive my not-so-perfect English :-)
I'm writing some code for a minimal loop player based on two threads: one handles a beat counter, the other feeds the soundcard with audio frames through ALSA. When the beat counter has completed a full cycle (e.g. 4/4) it simply rewinds the PCM data to byte 0, making a seamless loop. Really straightforward.
Now, I'm wondering how to implement the metronome side: should I rely on something like usleep/nanosleep, or does the ALSA layer offer a more suitable timer? Another potential issue would come from latency, obviously present within the audio thread (due to ALSA): what happens when the beat counter restarts the audio sample while an alsa frame is still being written to the soundcard?
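For concreteness, the restart logic I have in mind reduces to clamping each
write so it never crosses the loop point. A sketch, assuming the audio
thread writes up to one period of frames at a time:

```c
#include <stddef.h>

/* Return how many frames to write this time so that the write never
 * crosses the loop point: the audio thread itself decides, from its
 * frame position, where the loop restarts -- no separate timer needed.
 * pos is the current position in frames, loop_len the loop length in
 * frames, period the normal write size in frames.
 * After writing, the caller advances pos and wraps it to 0 at
 * loop_len, then (in a real player) passes the frames to a blocking
 * snd_pcm_writei() call. */
size_t frames_this_write(size_t pos, size_t loop_len, size_t period)
{
    size_t left = loop_len - pos;   /* frames until the loop point */
    return period < left ? period : left;
}
```

The last write before the loop point is simply shorter, so the rewind
always lands exactly on a frame boundary and the latency question goes
away.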
Thank you in advance for any suggestion!
Tb
After a flurry of belated Spring cleaning, I am happy to
announce updated versions of the following DSSI plugins
and host:
Xsynth-DSSI version 0.9.4, an analog-style (VCOs-VCF-VCA)
DSSI synth:
http://dssi.sourceforge.net/download.html#Xsynth-DSSI
WhySynth version 20100922, a versatile DSSI synth:
http://www.smbolton.com/whysynth.html
ghostess version 20100923, a lightweight GTK+ DSSI host:
http://www.smbolton.com/linux.html
New in these releases:
* GUI knobs now use cairo (when available) for smooth, anti-
aliased rendering.
* Patch and configuration file handling is now more graceful
in its handling of different locales.
* WhySynth: new minBLEP oscillator waveform (Clipped Saw).
* WhySynth: new effect (Sean Costello's Csound reverb).
* Six months' to a year's worth of unreleased bug fixes and
code cleanups.
Have fun,
-Sean Bolton