Hello, I have been googling around trying to find information on raw
pcm data. Does anybody know of any tutorials or references on raw pcm
data? I am most curious about different storage types (two's complement,
etc), and how multiple channels are stored. Thanks. -Garett
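For what it's worth, raw PCM samples are almost always stored as two's-complement signed integers (16-bit little-endian being the common case), and multiple channels are interleaved frame by frame: L0 R0 L1 R1 ... A minimal C sketch of that layout (function names are illustrative, not from any particular library):

```c
#include <stdint.h>
#include <stddef.h>

/* Interleaved layout: frame n holds one two's-complement sample per
   channel, in channel order.  For 16-bit stereo: L0 R0 L1 R1 ... */

/* Fetch the sample for `channel` in frame `frame`. */
int16_t pcm_get_sample(const int16_t *buf, size_t channels,
                       size_t frame, size_t channel)
{
    return buf[frame * channels + channel];
}

/* Split an interleaved stereo buffer into separate per-channel buffers. */
void pcm_deinterleave_stereo(const int16_t *in, size_t frames,
                             int16_t *left, int16_t *right)
{
    for (size_t i = 0; i < frames; i++) {
        left[i]  = in[2 * i];
        right[i] = in[2 * i + 1];
    }
}
```

WAV files use exactly this interleaved layout in their data chunk; 8-bit WAV is the odd one out, being unsigned rather than two's complement.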
dssi-vst is a DSSI wrapper plugin for VST plugins. It enables any
compliant DSSI host to use VST instruments and effects. It requires
Wine, liblo-0.9, dssi.h, and the Steinberg VST SDK headers to build.
http://sf.net/projects/dssi/
http://www.winehq.org/
http://plugin.org.uk/liblo/
http://www.steinberg.net/steinberg/ygrabit/index.html
dssi-vst is self-contained code; it doesn't use vstserver or libfst.
There's no very compelling reason for that, and there's nothing
inventive about it, but it works quite well for me with all two of
the existing DSSI hosts.
dssi-vst is licenced under the GPL with an additional exemption to
cover the non-redistributability of the Steinberg SDK headers. This
means it is technically not Free Software in the Debian/FSF sense.
Again, see the README for more details.
Chris
liblo is an implementation of the Open Sound Control[1] protocol for POSIX
systems. It is released under the GPL. It is written in ANSI C.
http://plugin.org.uk/liblo/
This release adds (over the last stable release, 0.5):
* Nonblocking message dispatcher (useful for single-GUI-thread
applications, e.g. Qt, GTK+)
* bugfixes to URL handling (0.6 - 0.8 broke DSSI)
* UNIX domain (FIFO) socket server/client support
* TCP domain socket server/client support
* A method to stop server threads
* Better regression tests (testlo)
* Fixed memory leaks
* More documentation
* Dynamic library building fixes
- Steve
[1] http://www.cnmat.berkeley.edu/OpenSoundControl/
Quoting Paul Davis <paul(a)linuxaudiosystems.com>:
> wrong model. a given jackd has a single driver. a new jack client,
> sure.
I believe the way to do this is to have one remote jackd with a driver
that sends/receives data through UDP and one local jack client that
interacts with this remote server.
There is something like this already, I believe (haven't checked):
http://www.alphalink.com.au/~rd/m/jack.udp.html
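The heart of such a driver is just framing sample blocks into datagrams with a sequence number, so the receiver can detect dropped packets (UDP gives no delivery guarantee). A rough C sketch of one possible packet layout -- hypothetical, not jack.udp's actual wire format:

```c
#include <stdint.h>
#include <string.h>
#include <stddef.h>
#include <arpa/inet.h>  /* htonl/ntohl */

/* Hypothetical packet: a 32-bit big-endian sequence number followed by
   the raw IEEE-754 bits of each float sample, also big-endian. */

size_t audio_packet_write(uint8_t *pkt, uint32_t seq,
                          const float *samples, size_t nsamples)
{
    uint32_t word = htonl(seq);
    memcpy(pkt, &word, 4);
    for (size_t i = 0; i < nsamples; i++) {
        uint32_t bits;
        memcpy(&bits, &samples[i], 4);   /* raw float bits */
        bits = htonl(bits);
        memcpy(pkt + 4 + 4 * i, &bits, 4);
    }
    return 4 + 4 * nsamples;             /* bytes to send in one datagram */
}

/* Returns the sequence number; the caller compares it against the last
   one seen to detect drops or reordering. */
uint32_t audio_packet_read(const uint8_t *pkt, float *samples, size_t nsamples)
{
    uint32_t seq;
    memcpy(&seq, pkt, 4);
    seq = ntohl(seq);
    for (size_t i = 0; i < nsamples; i++) {
        uint32_t bits;
        memcpy(&bits, pkt + 4 + 4 * i, 4);
        bits = ntohl(bits);
        memcpy(&samples[i], &bits, 4);
    }
    return seq;
}
```

A real driver would also have to agree on sample rate and period size out of band, and cope with jitter; this only shows the serialization step.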
As a side note, the system I developed intends to do this over ladspa;
more on this in another message.
> oh, and a small correction. VST System Link has basically nothing to
> do with networked audio. [...] it does *not* distribute audio
> across the network at all.
If I understood it correctly, yes it does, but their concept of a
"network" is somewhat weird: it allows you to send data from one
machine to the other for remote processing, but it uses the
digital audio ports of the soundcards of the machines to pass
the audio data around (together with other data). Therefore, it
only works with audio cards with digital I/O and word clock.
See ya,
Nelson
----------------------------------------------------------------
This message was sent using IMP, the Internet Messaging Program.
On Aug 18, 2004, at 12:38 PM,
linux-audio-dev-request(a)music.columbia.edu wrote:
>> -- There are tools for synchronization (RTCP mappings of NTP
>> and RTP timestamps), tools for security (SRTP), tools for
>> all sorts of things someone might need to do someday.
>
> this does seem very useful. there's no way to transport time between 2
> jackd instances right now, and it would be wise to reuse existing
> technology whenever this is added. otoh, it has to be a bit more
> extensive since we need music and/or smpte time rather than just
> wallclock time.
One way to do this is to have a multi-stream session, with one of
the sessions being RTP MIDI, that uses System Exclusive commands
to do MTC, or MIDI sequencer, or MMC to do your timing. So,
this would recreate the current hardwired world, but using RTP MIDI
to do pseudo-wire emulation of the MIDI cable carrying MTC ... the
RTP RTCP stream would have NTP, as would all the audio RTP streams,
and the receiver uses these common NTP timestamps to derive
cross-sync between the MTC sync information in RTP MIDI and the RTP
timestamps on the audio stream.
Of course, this only works as well as your NTP sync ... in an ideal
world, a single server generates these streams off of a single NTP
clock, or at least you have a very good NTP daemon keeping things in
sync.
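The receiver-side arithmetic is small once each stream has an (NTP, RTP timestamp) anchor from its RTCP sender reports. A sketch in C -- names are illustrative, not from any RTP library:

```c
#include <stdint.h>

/* An RTCP sender report pairs an RTP timestamp with an NTP wallclock
   time.  Given one such anchor per stream, a receiver can map any RTP
   timestamp onto the common NTP timeline and so align two streams
   (say, RTP MIDI carrying MTC against an audio stream). */

typedef struct {
    double   ntp_secs;   /* wallclock time from the sender report */
    uint32_t rtp_ts;     /* RTP timestamp paired with it */
    uint32_t clock_rate; /* RTP clock rate, e.g. 44100 for audio */
} rtp_anchor;

/* Wallclock time of an RTP timestamp.  The subtraction is done in
   uint32_t so a forward wrap of the 32-bit RTP timestamp still yields
   the correct modulo-2^32 difference. */
double rtp_to_wallclock(const rtp_anchor *a, uint32_t ts)
{
    uint32_t delta = ts - a->rtp_ts;
    return a->ntp_secs + (double)delta / a->clock_rate;
}
```

Mapping both streams' timestamps through this onto the same NTP timeline is what gives the cross-sync between the MTC in the RTP MIDI stream and the RTP timestamps on the audio stream.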
Roger Dannenberg gave a good talk on these issues at the OSC-fest
here at Berkeley last month ...
---
John Lazzaro
http://www.cs.berkeley.edu/~lazzaro
lazzaro [at] cs [dot] berkeley [dot] edu
---
This is along the same lines as the recent question about which API to
use for sound (to which I gave a poor answer; I repent!). What are the
options for doing MIDI? Is it best to use the ALSA library API, or is
there something better?
--
.O. Hans Fugal | De gustibus non disputandum est.
..O http://hans.fugal.net | Debian, vim, mutt, ruby, text, gpg
OOO | WindowMaker, gaim, UTF-8, RISC, JS Bach
---------------------------------------------------------------------
GnuPG Fingerprint: 6940 87C5 6610 567F 1E95 CB5E FC98 E8CD E0AA D460
I'm pleased to announce the beta release of Rivendell v0.9.0. Rivendell is a
complete, GPLed radio broadcast automation solution, with facilities for the
acquisition, management, scheduling and playout of audio content. Further
information, screenshots and download links can be found at:
http://www.salemradiolabs.com/rivendell/
This release marks a significant milestone in the project in that Rivendell is
now feature-complete, with all functionality required for basic broadcast
operation. Some of the new features of this beta release include:
1) Binary RPM support for SuSE 9.1.
2) An automatic log generation and import utility, RDLogManager.
RDLogManager supports integration with virtually all third-party traffic and
music scheduler packages in use today.
3) Enhanced support for the JACK Audio Connection Kit, allowing reliable
operation with standard 'garden variety' sound cards using the Linux ALSA
driver.
4) Improved stability and usability in virtually all components.
Cheers!
|-------------------------------------------------------------------------|
| Frederick F. Gleason, Jr. | Director of Broadcast Software Development |
| | Salem Radio Labs |
|-------------------------------------------------------------------------|
| All great ideas are controversial, or have been at one time. |
| -- Anonymous |
|-------------------------------------------------------------------------|
Quoting Dave Robillard <drobilla(a)connect.carleton.ca>:
> I still say networked audio belongs in jack, not a plugin.
It belongs in both:
- If you want to use the network to increase your total processing
power, you probably just want to offload some plugins to a remote
machine. Sure, you may run jack-rack remotely and send/receive
the data through a jack client locally, but this may become a
mess in some situations if you need the plugin to be run "inside"
an app. It also prevents you from using any kind of plugin
automation. Therefore, in such cases it is more interesting to
use ladspa for the distributed processing.
- OTOH, if you need something more complicated to be done remotely
(like disk I/O or running a complete instance of ardour), using
jack (as both Steve Harris and I described in other messages)
may be better.
> I guess a VST solution existing would suggest otherwise
VST System Link works on top of ASIO, not VST. Don't know about
FX-Teleport.
> I think it's a stupid idea well outside the realm of plugins (especially
> LADSPA, which shouldn't be sending things over a network)
It is a hack, but a useful one.
> your patches are going to have to be specifically
> set up to be sending crap over the network (ie they'll have to have a
> special plugin loaded, and all the audio going into that).
And why is that a problem? You just load a special plugin that appears
to be the plugin you intended. The app can't tell the difference.
> If it was Jack, everything in every app could be set up just as usual,
> and it would be sent over the network by jack, unbeknownst to the app.
In my system, everything is completely transparent to the app.
See ya,
Nelson
On Aug 18, 2004, at 2:15 AM, Paul Davis wrote:
> and in fact, jlc and i have done some tentative experiments with
> *live network audio* using jackd and ices w/jack support using only
> our DSL connectivity. the model that ices uses is more or less
> perfect, i think. just a client with some ports that happen to send
> audio to the rest of the world. only the user knows that, other apps
> just think its a regular client. jack doesn't care either, so everyone
> who has a different idea of how to actually connect the endpoints can
> do their own thing and everyone can coexist.
I'd really suggest considering the pros of integrating IETF tools
(SIP, RTSP, RTP) into this scheme. You could still use jack
as your application layer, but instead of engineering your own
transport layers for session management (SIP, RTSP) and media
(RTP), you'd use IETF protocols -- just like you use TCP instead of
re-inventing it for each app that needs a reliable bytestream.
We're seeing the IETF stack used this way more and more in the
commercial world -- the wireless audio servers (Apple Airport
Express, etc) use RTSP and RTP.
Good reasons to do this:
-- You may think you're trying to solve a small well-defined problem,
but if Jack is a success, people are going to extend it to work in
all sorts of domains. The IETF toolset has been stretched in lots
of ways by now -- interactive and content-streaming, unicast and
multicast, LAN and WAN, lossy and lossless networks, etc -- and
it's known to adapt well. Traditional go-it-alone companies, like
Apple, use it all over the place -- iChat AV and Quicktime both use RTP,
iChat AV uses SIP, Quicktime uses RTSP.
-- Modern live-on-stage applications use video, and RTP has a
collection of video codecs ready to go. Ditto for whatever other
sort of uncompressed or compressed media flow you need.
-- There are tools for synchronization (RTCP mappings of NTP
and RTP timestamps), tools for security (SRTP), tools for
all sorts of things someone might need to do someday.
-- The IPR situation is relatively transparent -- you can go to the
IETF website and look at IPR filings people have made on each
protocol, and at least see the non-submarine IPR of the people
who actually developed the protocols -- you can't be a WG member
and keep submarine patents away from the IETF.
-- Most of the smart people who work on media networking in all of
its forms do not subscribe to LAD. The easiest way to tap into
their knowledge is to use their protocols. And likewise, the smart
people here can take their results and turn them into
standards-track IETF working group items, and help make all media
apps work better.
---
John Lazzaro
http://www.cs.berkeley.edu/~lazzaro
lazzaro [at] cs [dot] berkeley [dot] edu
---
Hi,
Jackbeat 0.3.0 is available at: http://www.xung.org/jackbeat
This is a development release of my little-but-yet-flexible JACK sequencer.
Drummachine-like, shrinkable, scalable, unstable (oops... did I write that? :-)
I rewrote almost everything. This release is mainly focused on providing
pattern nesting ("buses", somehow), better IPC implementation, and an
elegant song file format based on XML and tar, as discussed here on LAD.
In the above statement, please do not miss "focused"... that is: the TODO
file is another major feature of this release.
--
og
"Make music, not war" -- me, today.