Is anyone out there using the OCaml bjack library?
I am going to be working on a program that interprets an SVG file as a
synthesis graph, with just the basic Pure Data objects to start with.
I will be using Inkscape as the editor for the synthesis graph
(polygons are processing units, diagram connectors are signal paths).
It seems like, with the higher-level processing I will be doing on the
input .svg XML file, OCaml would be an interesting language to use for
this.
Has anyone out there used the bjack library for OCaml, and if so, would
you happen to have some example code I could look at, like maybe a
program that just passes its input to its output? Google is not
helping me much; it mostly turns up the Debian/Ubuntu packages rather
than anything about how to use the library.
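For reference, the C equivalent of what I'm after looks like the sketch
below (this is against the JACK C API, which as I understand it bjack
wraps; I still haven't seen the OCaml side of it):

  #include <jack/jack.h>
  #include <string.h>
  #include <unistd.h>

  static jack_port_t *in_port, *out_port;

  /* Copy each input buffer straight to the output. */
  static int process(jack_nframes_t nframes, void *arg)
  {
      jack_default_audio_sample_t *in  = jack_port_get_buffer(in_port,  nframes);
      jack_default_audio_sample_t *out = jack_port_get_buffer(out_port, nframes);
      memcpy(out, in, nframes * sizeof *in);
      return 0;
  }

  int main(void)
  {
      jack_client_t *client = jack_client_open("passthru", JackNullOption, NULL);
      if (!client)
          return 1;
      in_port  = jack_port_register(client, "in",  JACK_DEFAULT_AUDIO_TYPE,
                                    JackPortIsInput,  0);
      out_port = jack_port_register(client, "out", JACK_DEFAULT_AUDIO_TYPE,
                                    JackPortIsOutput, 0);
      jack_set_process_callback(client, process, NULL);
      jack_activate(client);
      for (;;)
          sleep(1);  /* process() does all the work */
  }

An OCaml version of that loop is exactly the example code I'm hoping
someone can share.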
Hi
> On Wednesday 08 July 2009 17:43:47 Juhana Sadeharju wrote:
> > Why Linux does not support USB2 audio devices?
> > Where is the problem? (How can I help?)
>
> Repeating the archive: For a rather long time, there was no USB2 audio
> standard. Each device had its own proprietary protocol that only the
> vendor knew - hard to support in Linux. Now there is a standard, but no
> device actually uses it, because they all already have their own
> protocols. So in the end nothing has changed.
>
> And except from making the vendors publish their specs, there is nothing you
> can do.
This is of course very close to the problems we had with firewire devices:
there were published standards but they weren't always used. The difference
here was that some vendors *did* adopt the standard which made things a bit
easier. Even so, most still utilised vendor-specific extensions (especially
for device control).
AFAIK FFADO got started through the cooperation of one or two vendors and as
momentum has built additional vendors have come on board to a greater or
lesser extent. I expect something similar will have to happen for
professional USB2 audio devices to be generally supported under Linux.
Of course, the other way around the problem is to do protocol analysis, but
this really is the last resort. In many ways it is harder on USB because,
as I understand it, you can't have two USB bus hosts (i.e. two PCs)
connected to the same bus. You can probably do it using virtualisation, so
long as the virtualiser supports USB passthrough; the kernel's usbmon
facility may then give access to the traffic as it flows. For this to work
you'll need a pretty fast PC, since you'll be running audio applications
under a virtualised environment - I'm not sure the required timing can be
met even on today's fastest hardware. In any case, there will be a lot of
work involved in getting a start on this if vendor documentation is not
forthcoming.
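(For anyone wanting to experiment: usbmon exposes a text interface under
/sys/kernel/debug/usb/usbmon/ that can be read like an ordinary file.
Assuming debugfs is mounted and the device sits on bus 1, a minimal capture
loop is just:

  #include <stdio.h>

  /* Dump usbmon text-format events for USB bus 1. "0u" instead of
   * "1u" would capture traffic on all buses. */
  int main(void)
  {
      FILE *f = fopen("/sys/kernel/debug/usb/usbmon/1u", "r");
      char line[512];

      if (!f) {
          perror("usbmon");
          return 1;
      }
      while (fgets(line, sizeof line, f))
          fputs(line, stdout);
      fclose(f);
      return 0;
  }

Decoding the captured URBs into something meaningful is of course where the
real work starts.)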
I should add that from my point of view, drivers written without vendor
support are really to assist those moving to Linux from other platforms who
already have considerable investment in interfaces which they cannot or will
not replace just to switch platforms. If you are considering the purchase
of an interface specifically for use under Linux, I would encourage you to
buy from a vendor who actively supports Linux - thereby giving them a
reason to keep doing so.
Regards
jonathan
Hi Chris,
> > " 3.4 ?Install any non side-by-side shared files to the correct
> locations
> I've only just got around to reading this properly, but it seems to be
> saying the opposite of what you said -- it seems to be saying that
> components that are shared across multiple vendors (such as audio
> plugins of this type, presumably) may indeed be placed in the system
> directory. Am I reading it wrong?
"to ensure backward compatibility"
i.e. the old way was to write DLLs to the system directory. This is the
cause of 'DLL hell' - updating one application breaks another. Windows now
bends over backward to avoid this.
The new way is 'side-by-side' system components, where each version gets
its own directory. Virtualization ensures each application uses the version
it was originally written with.
Side-by-side doesn't apply to application plugins, I think; it's more
intended for operating system components.
Shared plugins go in "../Program Files/Common Files/"
Best Regards,
Jeff
Why Linux does not support USB2 audio devices?
Where is the problem? (How can I help?)
A USB2 digital TV device seems to output 10 GB in one hour. That is about
2.8 MB/s, which would match roughly 28 channels of 16-bit/48 kHz audio
(96 kB/s per channel).
Juhana
Hello
LADSPA, LV2 and DSSI specifications do not recommend default discovery
paths where hosts may look for plugins, which IMO is bad for the end
user because:
* end users shouldn't be asked to learn about path environment
variables by reading documentation in header files or hunting for it
on the web;
* when the environment path variables are not set, some hosts choose
to scan hard-coded directories, but these differ between hosts, and
the situation gets a lot worse on non-UNIX-like operating systems (read:
Windows) - for example, some might look inside a subdirectory of the
current user's home, others might only look in /usr/lib/<api>, etc.
I hereby propose some default paths, in the hope that API authors lurking
around here might want to recommend them and host authors might want to
use them:
LADSPA
Unix-like OSes with FHS/Unix-like filesystem layout:
/usr/lib/ladspa, /usr/local/lib/ladspa, ~/.ladspa
Windows: %PROGRAMFILES%\LADSPA, %APPDATA%\LADSPA
Mac OS X: /Library/Audio/Plug-Ins/LADSPA, ~/Library/Audio/Plug-Ins/LADSPA
DSSI
Unix-like OSes with FHS/Unix-like filesystem layout: /usr/lib/dssi,
/usr/local/lib/dssi, ~/.dssi
Windows: %PROGRAMFILES%\DSSI, %APPDATA%\DSSI
Mac OS X: /Library/Audio/Plug-Ins/DSSI, ~/Library/Audio/Plug-Ins/DSSI
LV2
Unix-like OSes with FHS/Unix-like filesystem layout: /usr/lib/lv2,
/usr/local/lib/lv2, ~/.lv2
Windows: %PROGRAMFILES%\LV2, %APPDATA%\LV2
Mac OS X: /Library/Audio/Plug-Ins/LV2, ~/Library/Audio/Plug-Ins/LV2
Plus, we could do the same for LRDF data:
Unix-like OSes with FHS/Unix-like filesystem layout:
/usr/share/ladspa/rdf, /usr/local/share/ladspa/rdf, ~/.ladspa/rdf
Windows: %PROGRAMFILES%\LADSPA\rdf, %APPDATA%\LADSPA\rdf
Mac OS X: /Library/Audio/Plug-Ins/LADSPA/rdf,
~/Library/Audio/Plug-Ins/LADSPA/rdf
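As a sketch of how a host might consume these defaults (LADSPA, FHS case
only; the helper name and fallback order here are just illustrative):

  #include <stdio.h>
  #include <stdlib.h>

  /* Build a colon-separated LADSPA search path: honour the
   * LADSPA_PATH environment variable when set, otherwise fall
   * back to the proposed default directories. */
  static void ladspa_search_path(char *buf, size_t len)
  {
      const char *env  = getenv("LADSPA_PATH");
      const char *home = getenv("HOME");

      if (env && *env)
          snprintf(buf, len, "%s", env);
      else
          snprintf(buf, len,
                   "%s/.ladspa:/usr/local/lib/ladspa:/usr/lib/ladspa",
                   home ? home : "");
  }

  int main(void)
  {
      char path[1024];
      ladspa_search_path(path, sizeof path);
      printf("scanning: %s\n", path);
      return 0;
  }

The point is that the fallback would then live in one agreed place instead
of differing from host to host.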
What do you think about it?
Stefano
> From: Chris Cannam
> On Thu, Jun 25, 2009 at 9:13 PM, Jeff McClintock<jef(a)synthedit.com>
> wrote:
> > Windows has official rules for this. Users are no longer allowed to add
> > random files to an application's directory in "/Program Files/Appname".
>
> Oh! This is news to me -- interesting news too, given that I
> distribute Windows versions of SV without an installer and just expect
> the user to copy it to %ProgramFiles% if they want it to go there, and
> that it only looks in immediate subdirectories of %ProgramFiles% for
> plugins of any sort.
>
> I don't recall anyone complaining to me that they couldn't install
> plugins for it -- maybe this just means nobody is using it?!
On Windows XP, many people run as administrator, so they won't notice
anything. On Vista and Windows 7, they will get a 'UAC' warning that they
are doing something restricted.
> Can you point to any documentation for this? I'd like to know what
> other rules I might be falling afoul of.
http://msdn.microsoft.com/en-us/library/ms995843.aspx
(most of that is about system DLLs and versioning). The relevant part for
plugins is...
" 3.4 Install any non side-by-side shared files to the correct locations
The proper location for shared components depends on whether these
components are shared across companies or by a single company.
Shared components that are private to a single software vendor must be
installed in one of two places. Do not store these files in the system
directory
common files directory\<company name>
%ProgramFiles%\<company name>\Shared Files
The common files directory can be accessed by passing
CSIDL_PROGRAM_FILES_COMMON to the SHGetFolderPath API."
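In code that amounts to something like this (Win32 sketch, error handling
trimmed):

  #include <windows.h>
  #include <shlobj.h>
  #include <stdio.h>

  int main(void)
  {
      char path[MAX_PATH];

      /* Resolves to e.g. "C:\Program Files\Common Files" */
      if (SHGetFolderPathA(NULL, CSIDL_PROGRAM_FILES_COMMON, NULL,
                           SHGFP_TYPE_CURRENT, path) == S_OK)
          printf("%s\n", path);
      return 0;
  }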
> > typically
> > "C:\Program Files\Common Files\LADSPA Plugins..."
>
> I don't suppose you happen to know whether any Windows-based LADSPA
> hosts are actually using this path?
I don't know about LADSPA, but the latest Cubase uses "C:\Program
Files\Common Files\Steinberg\VST2" for 3rd-party plugins.
Personally I think the company name could be omitted because many other
individuals and companies need to put plugins in there too.
Jeff M
Hello Developers,
I am a newbie to the Linux development environment.
We are porting Linux 2.6.25.4 to a platform which has an ARM11 core
processor, and my job is to implement the audio device driver.
Most of the docs are meant for PC audio, which has sound cards on PCI,
and the source code resides in the sound folder under the kernel root.
Since mine is an embedded platform, where will my architecture-specific
code reside?
Which folders under the sound folder are used for embedded devices,
and which APIs should I look into?
Which APIs will be called for system sounds?
Can anybody share their experience in implementing an audio device
driver on an embedded platform?
Thanks a lot in advance.
Regards,
Pankaj
Hi Everyone,
I've released a new version of the Invada LV2 plugins. Major changes are:
* Added new plugin: 'Delay - Munge'.
A delay with a non-linear response in the feedback loop. The 'munge' effect
is more noticeable the higher the feedback. Also features an LFO and a
delay calculator.
Screenshot:
http://www.invadarecords.com/images/downloads/Screenshot-Invada_Delay_Munge…
* Added new plugin: 'Test Tones'
This was more for myself, but others may appreciate it. Just a simple sine
oscillator, but via the custom GUI the frequency can be set to well-known
calibration and musical frequencies. Useful if you need to fault-find,
calibrate analogue equipment, or are desperate to tune an instrument when
you have no tuner handy.
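(A sine test tone is about the simplest DSP there is; a generic
phase-accumulator sketch, not Invada's actual code:

  #include <math.h>

  /* Fill one buffer with a sine at the given frequency, carrying
   * the oscillator phase across calls. */
  static void sine_block(float *out, int n, double *phase,
                         double freq, double sample_rate)
  {
      double inc = 2.0 * M_PI * freq / sample_rate;
      int i;

      for (i = 0; i < n; i++) {
          out[i] = (float)sin(*phase);
          *phase += inc;
          if (*phase > 2.0 * M_PI)
              *phase -= 2.0 * M_PI;
      }
  }

The value of the plugin is really the GUI's preset list of calibration and
musical frequencies, not the oscillator itself.)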
* Eliminated 'zipping' from changing controls.
All plugins now handle parameter changes (across the entire range) without
producing any audio artefacts ('zipping' noises). No issues with automation of
controls.
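(For anyone curious how 'zipping' is usually eliminated: the standard trick
is to slew each control toward its target per sample instead of jumping.
A generic one-pole lag sketch, not necessarily the exact approach used here:

  /* One-pole parameter smoothing: each sample, move the live value a
   * small fraction of the way toward the target, so a control jump
   * never appears as a step ("zipper" noise) in the audio. */
  typedef struct {
      float current;
      float coeff;   /* roughly 1 - exp(-1 / (tau_sec * sample_rate)) */
  } param_smooth;

  static float param_smooth_run(param_smooth *p, float target)
  {
      p->current += p->coeff * (target - p->current);
      return p->current;
  })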
* Filters now adjustable from the display
Adjust filters by dragging the response curve about on the display within
the filter plugins.
* Some RDF updates
+ ladspa URN as per
http://lists.lv2plug.in/pipermail/devel-lv2plug.in/2009-June/000226.html
+ Added extended port properties to rdf (log)
* Numerous tweaks/improvements.
People using lv2rack version 4 should upgrade to version 5 before using this
release as I've backed out a workaround from version 1.0.1 which was done just
to support version 4 of lv2rack.
Download is here: http://www.invadarecords.com/Downloads.php?ID=00000264
Ubuntu packages here: https://launchpad.net/~invada/+archive/ppa
cheers,
Fraser
Hello to all,
I would like to write a program which does the following:
detect the input events generated by a USB device which I own - the Rig
Kontrol 2 from Native Instruments, a pedal board intended for guitar use
which already has a Linux driver - and trigger MIDI messages with them;
i.e. pushing a button would create a MIDI note on/off, rolling the pedal
a MIDI CC. As said, the pedal already produces Linux input events, so it
would be a matter of creating a MIDI client (JACK or ALSA), grabbing
these events and creating MIDI from them. I would then use these MIDI
messages to control JACK applications (mostly rakarrack).
My C/programming knowledge is basic, so I ask your advice:
1) would it be easier to achieve this through JACK or ALSA? I'm confused
here because the JACK applications I use - Hydrogen, for example - all
appear, as far as MIDI is concerned, under the ALSA tab in QjackCtl's
Connections window. Are they really ALSA MIDI clients being wrapped by
JACK? Is there a reason for writing ALSA MIDI clients instead of native
JACK MIDI clients?
So my problem is whether I should learn the JACK API or the ALSA one.
2) Once I've decided on 1), what is the best place to document myself?
I would prefer, if possible, to learn only what I need for this simple
project - which for me, btw, is anything but simple. I mean not learning
the whole JACK or ALSA API if I'm only going to use the basics.
I really want to do this, but as you can see it's a difficult project
for me. It would be wonderful if someone could give me somewhat detailed
advice on what "bricks" I'll need: relevant functions from the APIs,
general code structure and, well, anything that will make my life
easier :)
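For concreteness, the ALSA sequencer version of the loop I have in mind
might look roughly like this - the device path, note mapping and CC number
are just placeholders, and I don't know yet if this is the right approach:

  #include <fcntl.h>
  #include <unistd.h>
  #include <linux/input.h>
  #include <alsa/asoundlib.h>

  int main(void)
  {
      /* event device node for the pedal board - illustrative path */
      int fd = open("/dev/input/event5", O_RDONLY);
      snd_seq_t *seq;
      int port;
      struct input_event ev;

      if (fd < 0 || snd_seq_open(&seq, "default", SND_SEQ_OPEN_OUTPUT, 0) < 0)
          return 1;
      snd_seq_set_client_name(seq, "rigkontrol-midi");
      port = snd_seq_create_simple_port(seq, "out",
                 SND_SEQ_PORT_CAP_READ | SND_SEQ_PORT_CAP_SUBS_READ,
                 SND_SEQ_PORT_TYPE_MIDI_GENERIC);

      while (read(fd, &ev, sizeof ev) == sizeof ev) {
          snd_seq_event_t e;
          snd_seq_ev_clear(&e);
          snd_seq_ev_set_source(&e, port);
          snd_seq_ev_set_subs(&e);
          snd_seq_ev_set_direct(&e);

          if (ev.type == EV_KEY)          /* button -> note on/off */
              snd_seq_ev_set_noteon(&e, 0, 60 + ev.code % 12,
                                    ev.value ? 100 : 0);
          else if (ev.type == EV_ABS)     /* pedal -> CC 11 */
              snd_seq_ev_set_controller(&e, 0, 11, ev.value & 0x7f);
          else
              continue;

          snd_seq_event_output(seq, &e);
          snd_seq_drain_output(seq);
      }
      return 0;
  }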
thank you,
Renato