[LAD] making sense of Jack MIDI; or, is this an appropriate use for Jack?

Pedro Lopez-Cabanillas pedro.lopez.cabanillas at gmail.com
Sat Feb 16 17:21:59 UTC 2013


On Saturday 16 February 2013 13:15:51 Pedro Lopez-Cabanillas wrote:
> On Friday 15 February 2013 15:19:03 M Donalies wrote:
> > If I want to support
> > both audio and MIDI, then I have to learn two completely different and
> > conflicting APIs.
>  
> Audio and MIDI are two totally different concepts from the beginning. You
> can mix both of them in your application, but to do so you have to choose
> one point of view or the other, and the chosen alternative will largely
> determine your application's functionality and which users are going to be
> happy with(out) it.
> I like to explain this issue with an analogy: it is similar to the image
> world, where you have vector graphics (MIDI) and bit-mapped or raster
> graphics (digital audio). There are Linux programs working from both points
> of view, for instance Inkscape (vector graphics) and Gimp (bitmaps). You
> can use bitmaps in Inkscape and vector graphics in Gimp, and they can
> interoperate quite well, but when you mix both worlds, each program tries
> to convert the alien object to its own point of view. There are graphic
> artists who start a design with Inkscape, producing SVG files, and for the
> final product import an SVG into Gimp. There are also people who work
> directly in Gimp from scratch.
> Allowing for the differences, it is comparable to the workflow of some
> musicians who start with a MIDI draft of the composition and arrangement,
> and finish it in a digital audio workstation. There are also musicians who
> work in a DAW from scratch. In this analogy, the issues with zooming images
> are comparable to audio stretching/shrinking, and the ability to change
> tempo or scale on the fly is a direct consequence of working with
> arbitrary/symbolic units for measuring time and distance, compared with
> measuring them in pixels and frames/seconds.

To complete the analogy, we should include computer hardware in the picture.
When a vector graphics application shows an image on a computer screen, it
has rendered the picture using either a software-only engine or an
accelerated one assisted by a graphics card, probably through OpenGL. CAD
programs also use vector graphics, and their output is usually sent to
plotters and 3D printers, which don't require bit-mapped renderings before
materializing the output.

Electronic musicians have MIDI synthesizers. Soft synths run on computers,
which may render MIDI sequences into digital audio directly. There are also
hardware synths. In the past, a hardware synth was included on almost every
computer sound card; sadly, that is no longer the case, though Creative still
ships Audigy cards containing a hardware MIDI synth. In any case, many
musicians own standalone hardware synths, and we are naturally conservative
regarding good musical instruments. But synthesizers are not the only musical
instruments that understand MIDI. For instance, the Yamaha Disklavier is a
line of acoustic pianos with MIDI interfaces that don't synthesize anything:
the sound is produced by hammers striking strings, as ever.
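To illustrate the point about soft synths rendering MIDI into digital audio, here is a minimal sketch (not from the original post; the function names are my own, and only the note-to-frequency formula is the standard equal-temperament mapping): a MIDI note number is symbolic, and turning it into audio means mapping it to a frequency and generating samples.

```python
import math

def midi_to_hz(note):
    """Equal-temperament mapping: MIDI note 69 (A4) = 440 Hz."""
    return 440.0 * 2.0 ** ((note - 69) / 12.0)

def render_note(note, duration=1.0, sample_rate=44100, amplitude=0.5):
    """Render one MIDI note as a plain sine wave -- the simplest possible
    'soft synth', ignoring envelopes, velocity, and timbre."""
    freq = midi_to_hz(note)
    n_samples = int(duration * sample_rate)
    return [amplitude * math.sin(2.0 * math.pi * freq * i / sample_rate)
            for i in range(n_samples)]

# Middle C (MIDI note 60) rendered as 100 ms of 44.1 kHz samples.
samples = render_note(60, duration=0.1)
```

The symbolic-to-rendered step happens entirely inside `render_note`; a hardware synth performs the same mapping in circuitry (or, in the Disklavier's case, in hammers and strings).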

There are other use cases for MIDI that don't involve soft synths, or that
don't involve music at all. What I find laughable is the arrogance of
pretending that everybody fits a single use case.

Regards,
Pedro