This is the initial release of XPolyMonk.lv2, a polyphonic version of
Xmonk.lv2.
XPolyMonk comes with 12 voices, full MIDI support, and an integrated
virtual keyboard.
It uses libxputty to create the interface:
https://github.com/brummer10/libxputty
The DSP part is heavily based on the FAUST `SFFormantModelBP` function from
physmodels.lib.
XPolyMonk is licensed under the BSD Zero Clause License, so you can do
with it whatever you like.
You'll find its development source code here:
https://github.com/brummer10/XPolyMonk.lv2
and the release here:
https://github.com/brummer10/XPolyMonk.lv2/releases
Happy Xmas to all
hermann
(Apologies for cross-postings)
ICAD 2020
Call for Submission of Papers, Extended Abstracts, Workshops, and
Tutorials
26th International Conference on Auditory Display
University of Florida, Gainesville, FL, USA
June 7-11, 2020
THEME: “SAFE AND SOUND”
Sound is used in a wide variety of applications to alert listeners to
the status of a person or environment. At ICAD 2020, we want to
highlight sonification work that is used to maintain awareness in some
capacity (outside navigation, hospitals, air traffic control, etc.).
Papers are not limited to this theme, as we will value and embrace all
types of submissions, including papers, posters, multimedia
(video/audio), demos, and concert pieces.
First held in 1992, ICAD is a highly interdisciplinary academic
conference with relevance to researchers, practitioners, musicians, and
students interested in the design of sounds to support tasks, improve
performance, guide decisions, augment awareness, and enhance
experiences. It is unique in its singular focus on auditory displays and
the array of perception, technology, and application areas that this
encompasses. Like its predecessors, ICAD 2020 will be a single-track
conference, open to all, with no membership or affiliation requirements.
ICAD 2020, the 26th International Conference on Auditory Display, will
be held at the University of Florida, June 7 to 11, 2020. The graduate
student ThinkTank (doctoral consortium) will be held on Sunday, June 7,
before the main conference.
PAPERS AND EXTENDED ABSTRACTS
The ICAD 2020 committee is seeking papers and extended abstracts that
will contribute to knowledge of how sonification can support awareness
in various contexts. For details on topics of interest, proposal format,
submission instructions, and additional conference information please
visit http://icad2020.icad.org, where details will be updated as they
are made available.
WORKSHOPS AND TUTORIALS
ICAD workshops and tutorials provide in-depth opportunities for
conference attendees to discuss and explore important aspects of the
field of auditory display with like-minded researchers and
practitioners. Sessions can range from applications and programming
methodologies to interdisciplinary research skills, emerging research
areas, challenge problems, and sonification/compositional Maker sessions.
IMPORTANT DATES:
* Monday, March 2, 2020 - Deadline for submission of full papers
* Monday, March 9, 2020 - Deadline for submission of workshops and tutorials
* Monday, March 9, 2020 - Deadline for submission to the student ThinkTank
* Monday, April 6, 2020 - Deadline for submission to the sonification concert and installations
* Friday, April 10, 2020 - Notification of decisions
* Monday, April 20, 2020 - Extended abstract submission (some full paper submissions may be recommended for the extended abstract category)
Papers Chair - Bruce Walker - icad2020papers@icad.org
Workshop Chair - Derek Brock - icad2020workshops@icad.org
Sponsorship Chair - Myounghoon (Philart) Jeon - icad2020sponsorship@icad.org
Think Tank Chair - Areti Andreopoulou - icad2020thinktank@icad.org
Communications Chair - Katie Wolf - icad2020accessibility@icad.org
Steering Chair - Matti Gröhn - icad2020steering@icad.org
Conference Chair - Kyla McMullen - icad2020chair@icad.org
--
Kyla McMullen
Chair of ICAD 2020: http://icad2020.icad.org
Yoshimi 1.6.1 is now out!
For ALSA MIDI there is a search mode where Yoshimi will connect to every
readable source it can find.
There has been a correction to Scales note and frequency allocation terminology.
The mixer panel format can now be changed from Yoshimi->Settings.
The CLI has additional controls, particularly in banks and roots management.
Other stuff.
We've had to do an update to allow for the change in the MXML API.
Dots and dashes in grids are now solid lines. They render much faster.
Session/State/Config management is now unified. More system settings can be
changed by these files as Yoshimi instances start.
The Advanced User Manual has been significantly updated.
More details on all of these are in /doc/Yoshimi_1.6.1_features.txt
Yoshimi source code is available from either:
https://sourceforge.net/projects/yoshimi
Or:
https://github.com/Yoshimi/yoshimi
Full build instructions are in 'INSTALL'.
Our list archive is at:
https://www.freelists.org/archive/yoshimi
To post, email to:
yoshimi@freelists.org
--
Will J Godfrey
http://www.musically.me.uk
Say you have a poem and I have a tune.
Exchange them and we can both have a poem, a tune, and a song.
The stable version of FluidSynth 2.1 has been released, featuring a
new reverb engine, stereophonic chorus, support for DLS, and more.
Details can be found in the release notes:
Download: https://github.com/FluidSynth/fluidsynth/releases/tag/v2.1.0
API: http://www.fluidsynth.org/api/
Website: http://www.fluidsynth.org
FluidSynth is a real-time software synthesizer based on the
SoundFont(tm) 2 specifications. It can read MIDI events from the MIDI
input device and render them to the audio device. It can also play
MIDI files.
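As a quick illustration of that workflow (this sketch is not part of the
release announcement; the SoundFont path is only a placeholder and error
handling is kept to a minimum), a small C program using the public
FluidSynth API might look like this:

/* Build with: gcc demo.c $(pkg-config --cflags --libs fluidsynth) */
#include <fluidsynth.h>
#include <unistd.h>

int main(void)
{
    fluid_settings_t *settings = new_fluid_settings();
    fluid_synth_t *synth = new_fluid_synth(settings);

    /* Start an audio driver so the synth output reaches the audio device. */
    fluid_audio_driver_t *adriver = new_fluid_audio_driver(settings, synth);

    /* Load a SoundFont; the path below is just a placeholder. */
    if (fluid_synth_sfload(synth, "/usr/share/sounds/sf2/FluidR3_GM.sf2", 1) == FLUID_FAILED)
        return 1;

    /* Send MIDI events directly: middle C on channel 0 for one second. */
    fluid_synth_noteon(synth, 0, 60, 100);
    sleep(1);
    fluid_synth_noteoff(synth, 0, 60);

    delete_fluid_audio_driver(adriver);
    delete_fluid_synth(synth);
    delete_fluid_settings(settings);
    return 0;
}

For the "play MIDI files" case mentioned above, the same API also provides a
MIDI file player (new_fluid_player, fluid_player_add, fluid_player_play)
instead of sending note events by hand.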
Tom Moebert
FluidSynth Developer Team