Hello all,
(1) I'm new to jack (but a competent programmer). It appears that the
cvs repository is missing the top-level configuration file. Following
the instructions from the download page, I do a clean cvs checkout on
a Linux machine, cd to the jack directory, and execute
./autogen.sh
This fails saying
/usr/bin/m4: configure.in: No such file or directory
configure.ac: 218: required file `config/ltmain.sh' not found
Am I missing something?
(2) My actual interest is in looking into the jack-udp protocol, and
I tried getting this all to work on a Mac a few days ago and actually
got much further, but it looks like file recv.c has problems since it
doesn't appear to include any definition of the jackudp_t data type
that it tries to declare in the first line of code; i.e., I get:
/Users/stp/Code/jack/jack.udp/recv.c:14: error: 'jackudp_t'
undeclared (first use in this function)
Does jack.udp work in general?
Is there any documentation of the actual protocol used?
Has anyone thought of using TCP instead of UDP?
(3) I know that there's a binary distribution of the Mac OSX release,
and have it installed; does anyone know where there's a source
distribution?
stp
...any assistance greatly appreciated...
--
Stephen Travis Pope -- http://create.ucsb.edu/~stp
Center for Research in Electronic Art Technology, University of
California, Santa Barbara
Really—I don't know what the meaning or purpose of life is.
But it looks exactly as if something were meant by it. — C.
G. Jung
Hi,
On Sunday 25 Sep 2005 05:43, Stephen Travis Pope wrote:
> Netjack sounds interesting, but the SourceForge site has no code and
> no forum postings.
> Perhaps this is a case of starting a project by claiming the spot on
> SourceForge...
The code is there, but it's only in CVS; there are unfortunately no releases
made on that site yet (which says a little about the project's maturity).
As for the technical merits: I see that the discussion has been more about the
transport mechanism, and this project hasn't really touched on that subject
yet. To keep it simple, the data is sent via UDP as raw floats.
For low-latency, CPU-hungry work the interesting bit is that there is a master
jack server that drives all slave jack servers, thus working around the sync
issues.
A drawback of this is that only the master jack server can (easily) be
connected to an actual soundcard; the others are best suited for various
processing tasks (outboard effects, softsynths).
Regards,
Robert
>
> stp
>
> --
> Stephen Travis Pope -- http://create.ucsb.edu/~stp
> Center for Research in Electronic Art Technology, University of
> California, Santa Barbara
> Really—I don't know what the meaning or purpose of life is.
> But it looks exactly as if something were meant by it. — C.
> G. Jung
>
> Begin forwarded message:
> > From: Robert Jonsson <rj(a)spamatica.se>
> > Date: September 23, 2005 1:23:28 PM PDT
> > To: linux-audio-dev(a)music.columbia.edu
> > Cc: Stephen Travis Pope <stp(a)create.ucsb.edu>, jackit-
> > devel(a)lists.sourceforge.net
> > Subject: Re: [linux-audio-dev] Re: [Jackit-devel] (1) Jack --
> > busted? (2) jack.udp -- busted? (3) jack-osx -- binary-only?
> >
> >
> > Hi,
> >
> > On Friday 23 Sep 2005 21:35, Eric Dantan Rzewnicki wrote:
> >> On Fri, Sep 23, 2005 at 12:10:15PM -0700, Stephen Travis Pope wrote:
> >>> (2) My actual interest is in looking into the jack-udp protocol, and
> >>> I tried getting this all to work on a Mac a few days ago and
> >>> actually
> >>> got much further, but it looks like file recv.c has problems
> >>> since it
> >>> doesn't appear to include any definition of the jackudp_t data type
> >>> that it tries to declare in the first line of code; i.e., I get,
> >>>
> >>> /Users/stp/Code/jack/jack.udp/recv.c:14: error: 'jackudp_t'
> >>> undeclared (first use in this function)
> >>>
> >>> Does jack.udp work in general?
> >>> Is there any documentation of the actual protocol used?
> >>> Has anyone thought of using TCP instead of UDP?
> >>
> >> Alban Peignier uses Rivendell with jack for radio broadcasts. As I
> >> understand it he uses jack.udp to send the audio from the broadcast
> >> box to a separate box that does encoding for web streaming. So,
> >> there is at least one working use case.
> >
> > Distributing jack interests me, but sadly I have no time to dive
> > deeper at the moment.
> > Jackudp had a sibling called udpsync (not sure about the source
> > heritage) which, instead of connecting two jacks through the client
> > interface (which is prone to sync problems), drives the "slave"
> > through the backend. It does work, but is by no means finished.
> > I don't know if it's of interest; the latest sources are here anyway:
> > http://sourceforge.net/projects/netjack
> >
> >
> > Regards,
> > Robert
> >
> > --
> > http://spamatica.se/musicsite/
--
http://spamatica.se/musicsite/
Hi all!
I'm trying to design a library which should provide "graphical objects" for
the console, meaning standardised buttons, checkboxes, sliders, etc. The
purpose of this project is to bring GUI-based audio software to the console.
What I'm wondering now is the following:
It would still be a certain amount of work to program a text-based UI with
this library. So which way of using this library would be best?
Could any of you imagine good ways to communicate objects of your software
to a kind of server? Something like OSC? So this server can run, you start
the program on the GUI, and it tells the server: I have the following
objects.
Could this be a way of designing it? Would this be sensible? Or would it be
better to only provide a C++ API?
Good thoughts and ideas are welcome.
Kindest regards
Julien
--------
Music was my first love and it will be my last (John Miles)
======== FIND MY WEB-PROJECT AT: ========
http://ltsb.sourceforge.net - the Linux TextBased Studio guide
On Sep 24, 2005, at 9:02 AM, linux-audio-dev-
request(a)music.columbia.edu wrote:
> Is anyone interested in collaborating on a common sample streaming
> protocol (possibly based on a somewhat simplified version of SDIF or
> the SC3 protocol)?
I'd recommend using RTP as your base protocol, and defining your
SDIF or SC3-like payload as an RTP payload format. You'll pick up
the entire IETF multimedia protocol suite for free this way, including
RTP MIDI:
http://www.cs.berkeley.edu/~lazzaro/rtpmidi/index.html
I think when it comes to networking, the writing is on the wall: packet
loss is a part of the environment you need to live in. Most new computer
purchases are laptops, most of those users want to use WiFi as their
network, and the Internet layer sees 1-2% packet loss on WiFi. Also, we
live in an era where people want to run LAN apps on the WAN and WAN apps
on the LAN, and packet loss is an unavoidable part of the WAN Internet
experience.
In addition, modern applications want to use link-local Internet multicast.
RTP was built for letting payload formats handle packet losses in
a way that makes sense for the media type -- RTP MIDI is an extreme
example of this, but the audio and video payload formats are
loss-tolerant in more subtle ways. RTP is also multicast-compatible.
Finally, with RTP there's a standardized way to use RTSP and SIP
for your session management if you wish, or if you prefer, you can
just build RTP into whatever session manager you have committed
to (like jack).
---
John Lazzaro
http://www.cs.berkeley.edu/~lazzaro
lazzaro [at] cs [dot] berkeley [dot] edu
---
Paul Davis
>On Tue, 2005-09-20 at 08:29 +0300, Aaron wrote:
>> please save me from another lisp/scheme scriptable application....
>> > The scripting should be in a language easy enough for a
>> > non-programmer to use.
>>
>> Is xsl a possibility or is there a scripting language that is easier
>> than lisp/scheme?
>
>breathe deeply. think of snakes. say "python".
Are you serious? Do you know python? I hope not...
I don't want to start a flame-war over programming languages,
but I know both scheme and python very well, and would
never consider python as an extension language again.
--
Slat 0.3 is now finished.
Window size can be set with -s
Notes are linearly spaced (to prevent the headf**k)
Flat/sharp notes are darkened
http://blog.dis-dot-dat.net/2005/09/slat-03.html
--
"I'd crawl over an acre of 'Visual This++' and 'Integrated Development
That' to get to gcc, Emacs, and gdb. Thank you."
(By Vance Petree, Virginia Power)
Hi!
I am currently working on digesting the USB-MIDI Class Definition ...
http://www.usb.org/developers/devclass_docs/midi10.pdf
As I understand it, you can have up to 16 USB MidiStreams (MS), each equal
to a physical MIDI cable (and each "cable" having 16 virtual
MIDI channels). There is a bandwidth limit of one 3-byte MIDI event/ms,
which makes sense given the bandwidth of a physical MIDI cable.
The USB-MIDI device also has a control channel without any endpoints
(without any physical MIDI jacks). Again, as far as I have
understood, the control channel is not a MidiStream and should therefore
be able to accept a significantly higher transfer rate than the physical
MidiStreams.
Question: How do I determine the max transfer rate for the control
channel (bInterfaceSubClass == 1) as opposed to the physical MIDI outs
(bInterfaceSubClass == 3)?
This is for setting LEDs partially lit for more than one parameter, by
pulse-width modulation over USB-MIDI.
mvh // Jens M Andreasen
When trying to compile rosegarden, I am getting the following error
during ./configure:
checking if UIC has KDE plugins available... no
configure: error: you need to install kdelibs first.
I am running FC4 x86-64 with gcc4. Any idea? Has anyone encountered
the same problem?
I think it may be that Red Hat compiled Qt without the -threads
option.
Hello, I am trying to figure out what my Live 512 & ALSA are capable of.
I have been trying to compare what I am seeing in the system with what I
have been able to find in literature and posts. I was wondering if
anyone might be able to offer any clarification. First of all, I am
trying to figure out what the different I/O devices I see are. In
/proc/asound/devices I see:
4: [0- 0]: hardware dependent
8: [0- 0]: raw midi
19: [0- 3]: digital audio playback This is digital mixer output?
18: [0- 2]: digital audio playback This is Synth or FX output?
26: [0- 2]: digital audio capture FX capture?
25: [0- 1]: digital audio capture Device 1?
16: [0- 0]: digital audio playback Is this the codec playback?
24: [0- 0]: digital audio capture Is this the codec capture?
0: [0- 0]: ctl
1: : sequencer
6: [0- 2]: hardware dependent
9: [0- 1]: raw midi
10: [0- 2]: raw midi Which midi devices are which?
What do the different numbers represent? Is the format
"subdevice: [card-device]:"? This seems reasonable, but it kind of
contradicts the output I get from aplay -l:
**** List of PLAYBACK Hardware Devices ****
card 0: Live [SB Live [Unknown]], device 0: emu10k1 [ADC
Capture/Standard PCM Playback]
Subdevices: 32/32
Subdevice #0: subdevice #0
...
Subdevice #31: subdevice #31
card 0: Live [SB Live [Unknown]], device 2: emu10k1 efx [Multichannel
Capture/PT Playback]
Subdevices: 8/8
Subdevice #0: subdevice #0
...
Subdevice #7: subdevice #7
card 0: Live [SB Live [Unknown]], device 3: emu10k1 [Multichannel Playback]
Subdevices: 1/1
Subdevice #0: subdevice #0
Here it would seem that device 2 does not have a subdevice 26, as
suggested by my (perhaps wrong) interpretation of /proc/asound/devices.
Thanks. -Garett