Yes, thanks FFanci, I know about Csound - I wrote that Csound example :)
But I was looking for something like Qmidiroute, with the added ability to convert a MIDI controller to a MIDI output channel. While I know it can easily be done with Csound, the guy who will work on that particular computer does not know anything about Csound. I was hoping there was some kind of VST or standalone program (GUI or CLI) that could accomplish this too. Anyone?
-------------My music-----------------
http://www.jamendo.com/de/album/6789/
http://www.jamendo.com/en/album/7428/
-------------Work-----------------
http://www.geluidsmanvanhetnoorden.nl/
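For anyone scripting this themselves: remapping a controller's output to a different MIDI channel only requires rewriting the low nibble of each voice message's status byte. A minimal stdlib-only Python sketch (the helper name `remap_channel` is hypothetical, not part of any of the tools mentioned above):

```python
def remap_channel(msg: bytes, new_channel: int) -> bytes:
    """Rewrite the channel nibble of a MIDI voice message.

    new_channel is 0-based (0-15). Voice messages have status bytes
    0x80-0xEF; the low nibble is the channel. Anything else (clock,
    sysex, etc.) passes through untouched.
    """
    status = msg[0]
    if 0x80 <= status <= 0xEF:
        return bytes([(status & 0xF0) | (new_channel & 0x0F)]) + msg[1:]
    return msg

# CC#7 (volume), value 100, on channel 1 (status 0xB0) -> channel 3 (0xB2)
cc = bytes([0xB0, 7, 100])
print(remap_channel(cc, 2).hex())  # b20764
```

A GUI router like Qmidiroute does essentially this per-rule; the byte-level version is what a Csound or mididings patch would be doing under the hood.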
Hi everyone,
I am new to this list, but have been using Ardour for years. I also
contributed the original code for the Automation All feature for
effects plugins in Ardour. The code was promptly rewritten by the lead
developer though :P
I would like to introduce the debut album Ellipsis by Telephone Sound,
which was composed, recorded, and produced entirely with Ardour and
other open-source tools (ZynAddSubFx, Hydrogen, etc.):
http://telephonesound.com
It is electronic-y, so check it out if you like that kind of stuff.
You can listen to (and download) one of the tracks on SoundCloud:
https://soundcloud.com/defcronyke/telephone-sound-benevolent-gods
That track is released under the cc-by-sa license.
You can also listen to previews of the other tracks and buy the album
on Google Play Music:
https://play.google.com/store/music/artist?id=A47w7csthbrjo5gsu65r4czh2fm
I am subscribed to the mailing list now, so let me know what you think
of it. Now it's time to start work on the next album!
~Jeremy Carter
Hello,
in recent weeks I've been writing a filter/mapper for the Novation Launch
Control XL. It allows you to create custom configurations assigned to its
templates, select which MIDI device to send each controller to, and even
filter/modify the incoming MIDI events using mididings units.
It's still a work in progress (so, bugs!), but it should be stable enough
for now.
The program is called NoLaE and requires python 2.7, PyQt4 and mididings.
The project page is here: https://github.com/MaurizioB/nolae
If you own a Launch Control XL and would like to test it, I'd be happy to
have some feedback.
Thank you!
MaurizioB
Hi all!
I'm trying to find out what MIDI key/controller number this weird controller
I have is sending. Sure enough, I can try them all from 1 to 127, but I seem
to remember a utility for doing that?
It would speed up the process a lot :)
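On Linux the usual tool for this is `aseqdump` from alsa-utils: `aseqdump -l` lists the available ALSA sequencer ports, and `aseqdump -p <client:port>` prints every incoming event with its type, channel, and data bytes as you wiggle the control. The decoding it performs is straightforward; a stdlib-only sketch of the same idea (the function name `describe_midi` is made up for illustration):

```python
def describe_midi(msg: bytes) -> str:
    """Name the event type and data bytes of a 3-byte MIDI voice message."""
    status = msg[0]
    kind = status & 0xF0          # high nibble: message type
    channel = (status & 0x0F) + 1 # low nibble: channel, shown 1-based
    if kind == 0xB0:
        return f"CC #{msg[1]} = {msg[2]} on channel {channel}"
    if kind == 0x90:
        return f"note-on {msg[1]} velocity {msg[2]} on channel {channel}"
    if kind == 0x80:
        return f"note-off {msg[1]} on channel {channel}"
    return f"status 0x{status:02X}"

print(describe_midi(bytes([0xB0, 74, 127])))  # CC #74 = 127 on channel 1
```

So moving the mystery control while watching the dump tells you its controller number immediately, no 1-to-127 guessing required.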
yPhil
--
Yassin "xaccrocheur" Philip
http://manyrecords.com
http://bitbucket.org/xaccrocheur / https://github.com/xaccrocheur
(sorry for cross-posting; please distribute)
The IEM – Institute of Electronic Music and Acoustics – in Graz, Austria
is happy to announce its call for the 2017 Artist-in-Residence program.
http://residency.iem.at/
The residency is aimed at individuals wishing to pursue projects in
performance, composition, installation and sound art, development of
tools for art production and related areas. Individuals are asked to
submit a project proposal that is related to the fields of artistic
research of the IEM, such as:
* Spatialization/higher-order Ambisonics
* Sonic Interaction Design
* Audio-visuality
* Algorithmic Composition
* Algorithmic Experimentation
* Standard and non-standard Sound Synthesis
* Live Coding
Duration of residency: 5 months
Start date: June 1st 2017 (negotiable)
Monthly salary: approx. EUR 1100 (net)
*APPLICATION DEADLINE: 1st of October 2016*
_The Institute:_
The Institute of Electronic Music and Acoustics is a department of the
University of Music and Performing Arts Graz founded in 1965. It is a
leading institution in its field, with a staff of more than 25
researchers and artists. IEM offers education to students in composition
and computer music, sound engineering, contemporary music performance
and musicology. It is well connected to the University of Technology,
the University of Graz as well as to the University of Applied Sciences
Joanneum through three joint study programs.
The artwork produced at IEM is released through the Institute's own
OpenCUBE and Signale concert series, as well as through various
collaborations with international artists and institutions.
IEM's main activities are centered around the following research areas
* Computer Music
* Artistic Research
* Signal Processing and Acoustics
_What we expect from applicants:_
* A project proposal that adds new perspectives to the Institute's
activities and resonates well with the interests of IEM.
* Willingness to work on-site in Graz for most of the Residency.
* Willingness to exchange and share ideas, knowledge and results with
IEM staff members and students, and engage in scholarly discussions.
* The ability to work independently within the Institute.
* A dissemination strategy as part of the project proposal that
ensures the publication of the work, or documentation thereof, in a
suitable format. This could be achieved for example through the release
of media, journal or conference publication, a project website or other
means that help to preserve the knowledge gained through the Music
Residency and make it available to the public.
* A public presentation as e.g. a concert or installation, which
presents the results of the Artist Residency.
_What we offer:_
* 24/7 access to the facilities of the IEM.
* Exchange with competent and experienced staff members.
* A desk in a shared office space for the entire period and access to
studios including the CUBE which has a 24-channel loudspeaker system and
infrared motion tracking, according to availability.
* During the period from July 1st until end of September the resident
will have extensive access to the studios of the IEM.
* Regular possibilities for contact and exchange with peers from
similar or other disciplines.
* Infrastructure (electroacoustic music studios, icosahedral
loudspeaker array, motion capture technology).
* Concert and presentation facilities (CUBE 24 channel loudspeaker
concert space).
* A monthly salary of approx. EUR 1100 net per month in addition to
health and accident insurance.
_What we cannot offer to the successful applicant:_
* We cannot provide any housing.
* We also cannot provide continuous assistance and support, although
the staff is generally willing to help where possible.
* We cannot offer any additional financial support for travel or
material expenses.
An application form providing more information is available at
http://residency.iem.at/
Feel free to contact residency@iem.at if you have any questions.
Greetings,
This is, as the title suggests, an improvisation. A sub-bass gives the
slow rhythm for acoustic guitar and synth improvisations. This was all
done within two hours, so no fiddling with controls, mixing, and all
that. Not sure yet how to handle the bass sounds, though. I think it
can be much better than what it is now and, I haven't played this on a
car system, or any other audio system, so have no idea how it fares in
the 'real world'.
I find there's some feeling that's issued from this, which makes it
clearly in the no-machine category. :)
https://soundcloud.com/nominal6/jam6
Hi folks!
Long time, no see! I'm back again after another heart attack and a diabetes
diagnosis. But I'm feeling better than I did even several years ago. Well,
that's enough self-pity; here is my case: :-)
I have decided to test Seq24 again as a creative, effective and efficient MIDI
sequencing tool for composing and arranging. But over the years, I have found
it unstable enough to destroy my creative flow when I'm in that mood. That
also goes for the 0.92 release at the Seq24 project site at
launchpad.net.
So before I try 0.93 and get it under my skin, how about Seq42 or even
something else? I'm very comfortable with Rosegarden, but Seq24 is probably
the fastest thing to use and set up when one quickly wants to put some tracks
together for creative usage or arranging.
What do you find easiest and best to use when focusing only on making tracks and
arrangements as fast as possible? What do you use and what do you think?
Jostein
> http://www.pcworld.com/article/3077977/data-center-cloud/googles-magenta-pr…
>
> IMO, a lot of popular music is already machine-generated. ;)
i remember something along these lines from a few years back, an AI
project in the States that was far more advanced than Google's.
By using memory models, though, Google is making a good start. If
anyone can find an article by George Rochberg called "The
Avant-Garde and the Aesthetics of Survival," they will have an
excellent read.
One might observe that the machine wrote bad music. Well, humans are
already doing that, too, so Magenta has gotten at least that far! As
with chess machines, it may be a matter of time. Still, i'm not
holding my breath, and it's certainly time to listen to that Bartok
concerto.
il lupo