Hi all!
I'm trying to design a library that provides "graphical objects" for
the console, meaning standardised buttons, checkboxes, sliders, etc. The
purpose of this project is to bring GUI-based audio software to the console.
What I'm wondering now is the following:
It would still be a certain amount of work to program a text-based UI with
this library. So which way of using this library would be best?
Could any of you imagine good ways to communicate objects of your software
to a kind of server, something like OSC? The server would run, you would
start the program on the GUI side, and it would tell the server: I have the
following objects. Could this be a way of designing it? Would it be
sensible? Or would it be better to only provide a C++ API?
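To make the server idea concrete, here is a rough sketch (in Python rather than C++, purely for illustration) of what announcing objects to such a server might look like. Every address pattern and field name below is invented for this sketch, not part of any existing library or protocol:

```python
# Hypothetical sketch: an audio program describes its widgets and turns
# them into OSC-style (address, args...) messages that a console-UI
# server could render as text-based controls. All names are invented.

def announce_widgets(widgets):
    """Build one "/ui/add" message per widget description."""
    messages = []
    for w in widgets:
        # e.g. ("/ui/add", "slider", "gain", 0.0, 1.0, 0.5)
        messages.append(("/ui/add", w["type"], w["name"])
                        + tuple(w.get("args", ())))
    return messages

# What a program might send to the server on startup:
widgets = [
    {"type": "slider",   "name": "gain", "args": (0.0, 1.0, 0.5)},
    {"type": "checkbox", "name": "mute", "args": (False,)},
    {"type": "button",   "name": "panic"},
]
msgs = announce_widgets(widgets)
```

The point of the shape above is that the server only ever sees declarative descriptions, so any program in any language could drive the same console UI.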
Good thoughts and ideas are welcome.
Kindest regards
Julien
--------
Music was my first love and it will be my last (John Miles)
======== FIND MY WEB-PROJECT AT: ========
http://ltsb.sourceforge.net - the Linux TextBased Studio guide
On Sep 24, 2005, at 9:02 AM, linux-audio-dev-
request(a)music.columbia.edu wrote:
> Is anyone interested in collaborating on a common sample streaming
> protocol (possibly based on a somewhat simplified version of SDIF or
> the SC3 protocol)?
I'd recommend using RTP as your base protocol, and defining your
SDIF- or SC3-like payload as an RTP payload format. You'll pick up
the entire IETF multimedia protocol suite for free this way, including RTP
MIDI:
http://www.cs.berkeley.edu/~lazzaro/rtpmidi/index.html
I think when it comes to networking, the writing is on the wall:
packet loss is part of the environment you need to live in. Most new
computer purchases are laptops, most of those users want to use WiFi as
their network, and the Internet layer sees 1-2% packet loss on WiFi.
Also, we live in an era where people want to run LAN apps on the WAN
and WAN apps on the LAN, and packet loss is an unavoidable part of the
WAN Internet experience.
Modern applications also want to use link-local Internet multicast.
RTP was built to let payload formats handle packet loss in
a way that makes sense for the media type -- RTP MIDI is an extreme
example of this, but the audio and video payload formats are
loss-tolerant in more subtle ways. RTP is also multicast compatible.
Finally, with RTP there's a standardized way to use RTSP and SIP
for your session management if you wish, or, if you prefer, you can
just build RTP into whatever session manager you have committed
to (like JACK).
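As a concrete illustration of what building on RTP involves, here is a minimal sketch of packing the 12-byte RTP fixed header from RFC 3550. The payload type 96 is just a value from the dynamic range; nothing here is a registered payload format, and a real sample-streaming format would define its own payload layout after this header:

```python
import struct

# Sketch: pack the RTP fixed header (RFC 3550, section 5.1).
# Fields: V=2, P=0, X=0, CC=0 | M, PT | sequence | timestamp | SSRC.

def rtp_header(seq, timestamp, ssrc, payload_type=96, marker=0):
    vpxcc = 2 << 6                      # version 2, no padding/ext/CSRC
    m_pt = (marker << 7) | payload_type
    return struct.pack("!BBHII",
                       vpxcc, m_pt,
                       seq & 0xFFFF,
                       timestamp & 0xFFFFFFFF,
                       ssrc & 0xFFFFFFFF)

# A complete packet is just header + payload bytes:
packet = rtp_header(seq=1, timestamp=48000, ssrc=0xDEADBEEF) + b"payload"
```

The sequence number and timestamp are what let a loss-tolerant payload format detect and recover from dropped packets.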
---
John Lazzaro
http://www.cs.berkeley.edu/~lazzaro
lazzaro [at] cs [dot] berkeley [dot] edu
---
Paul Davis
>On Tue, 2005-09-20 at 08:29 +0300, Aaron wrote:
>> please save me from another lisp/scheme scriptable application....
>> The scripting should be in a language easy enough for a non-programmer
>> to use.
>>
>> Is XSL a possibility, or is there a scripting language that is easier
>> than lisp/scheme?
>
>breathe deeply. think of snakes. say "python".
Are you serious? Do you know Python? I hope not...
I don't want to start a flame war over programming languages,
but I know both Scheme and Python very well, and would
never consider Python as an extension language again.
--
Slat 0.3 is now finished.
Window size can be set with -s
Notes are linearly spaced (to prevent the headf**k)
Flat/sharp notes are darkened
http://blog.dis-dot-dat.net/2005/09/slat-03.html
--
"I'd crawl over an acre of 'Visual This++' and 'Integrated Development
That' to get to gcc, Emacs, and gdb. Thank you."
(By Vance Petree, Virginia Power)
Hi!
I am currently working on digesting the USB-MIDI Class Definition ...
http://www.usb.org/developers/devclass_docs/midi10.pdf
As I understand it, you can have up to 16 USB MidiStreams (MS), each
equivalent to a physical MIDI cable (and each "cable" having 16 virtual
MIDI channels). There is a bandwidth limit of one 3-byte MIDI event per
millisecond, which makes sense given the bandwidth of a physical MIDI
cable.
The USB-MIDI device also has a control channel without any endpoints
(without any physical MIDI jacks). Again, only as far as I have
understood it: the control channel is not a MidiStream and should
therefore be able to accept a significantly higher transfer rate than the
physical MidiStreams.
Question: how do I determine the max transfer rate for the control
channel (bInterfaceSubClass == 1) as opposed to the physical MIDI-outs
(bInterfaceSubClass == 3)?
This is for setting LEDs partially lit for more than one parameter, by
pulse-width modulation over USB-MIDI.
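For reference, the class definition linked above frames all MIDI traffic as 4-byte event packets: the first byte carries the cable number (high nibble) and Code Index Number (low nibble), followed by the three MIDI bytes. A small sketch of decoding one such packet (the example values are mine, not taken from the spec):

```python
# Sketch: decode a 4-byte USB-MIDI event packet per the class spec.
# Byte 0: cable number (high nibble) | Code Index Number (low nibble).
# Bytes 1-3: the MIDI event itself.

def parse_usb_midi_event(packet):
    assert len(packet) == 4
    cable = packet[0] >> 4
    cin = packet[0] & 0x0F
    midi = packet[1:4]
    return cable, cin, midi

# Note-on on cable 0: CIN 0x9 = note-on, then 0x90 0x3C 0x40
cable, cin, midi = parse_usb_midi_event(bytes([0x09, 0x90, 0x3C, 0x40]))
```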
mvh // Jens M Andreasen
When trying to compile Rosegarden, I get the following error
during ./configure:
checking if UIC has KDE plugins available... no
configure: error: you need to install kdelibs first.
I am running FC4 x86-64 with gcc4. Any ideas? Has anyone encountered
the same problem?
I think it may be that Red Hat compiled Qt without the -threads
option.
Hello, I am trying to figure out what my Live 512 card and ALSA are
capable of. I have been comparing what I see on the system with what I
have been able to find in literature and posts, and was wondering if
anyone might be able to offer some clarification. First of all, I am
trying to figure out what the different I/O entries I see are. In
/proc/asound/devices I see:
4: [0- 0]: hardware dependent
8: [0- 0]: raw midi
19: [0- 3]: digital audio playback This is digital mixer output?
18: [0- 2]: digital audio playback This is Synth or FX output?
26: [0- 2]: digital audio capture FX capture?
25: [0- 1]: digital audio capture Device 1?
16: [0- 0]: digital audio playback Is this the codec playback?
24: [0- 0]: digital audio capture Is this the codec capture?
0: [0- 0]: ctl
1: : sequencer
6: [0- 2]: hardware dependent
9: [0- 1]: raw midi
10: [0- 2]: raw midi Which midi devices are which?
What do the different numbers represent? subdevice: [card- device]: type?
That seems reasonable, but it kind of contradicts the output I get
from aplay -l:
**** List of PLAYBACK Hardware Devices ****
card 0: Live [SB Live [Unknown]], device 0: emu10k1 [ADC
Capture/Standard PCM Playback]
Subdevices: 32/32
Subdevice #0: subdevice #0
...
Subdevice #31: subdevice #31
card 0: Live [SB Live [Unknown]], device 2: emu10k1 efx [Multichannel
Capture/PT Playback]
Subdevices: 8/8
Subdevice #0: subdevice #0
...
Subdevice #7: subdevice #7
card 0: Live [SB Live [Unknown]], device 3: emu10k1 [Multichannel Playback]
Subdevices: 1/1
Subdevice #0: subdevice #0
Here it would seem that device 2 does not have a subdevice 26, as
suggested by my (perhaps wrong) interpretation of /proc/asound/devices.
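If it helps to check the mapping, here is a hedged sketch that parses lines of the /proc/asound/devices form shown above as minor: [card- device]: type. The field names are my reading of the listing, not taken from ALSA documentation; on that reading the leading number is the character-device minor, which is why it does not line up with the subdevice counts aplay -l reports:

```python
import re

# Sketch: parse "/proc/asound/devices"-style lines of the form
#   " 19: [0- 3]: digital audio playback"
# into (minor, card, device, type). Lines without a [card- device]
# field (e.g. " 1:  : sequencer") are skipped.

LINE = re.compile(r"\s*(\d+):\s*\[\s*(\d+)-\s*(\d+)\]:\s*(.+)")

def parse_devices(text):
    entries = []
    for line in text.splitlines():
        m = LINE.match(line)
        if m:
            minor, card, dev, kind = m.groups()
            entries.append((int(minor), int(card), int(dev), kind.strip()))
    return entries

sample = """ 19: [0- 3]: digital audio playback
 26: [0- 2]: digital audio capture"""
entries = parse_devices(sample)
```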
Thanks. -Garett
Hi
I am writing a grant application for a project I am doing. The potential
funder requires the archived audio to be in AES31 format.
Is there any application/library etc. on Linux that outputs AES31?
Thanks
Aaron
My Turtle Beach Santa Cruz (cs46xx) card has a strange problem:
mic recording in JACK is really distorted.
It records fine (as fine as my cheap mic allows) in Audacity, and
arecord sounds fine as well.
But when I record with TimeMachine in JACK, it sounds terribly distorted
(saturated, maybe).
Same mixer settings for all of the above. Has anyone else had luck
recording with this card in JACK?
--
Hans Fugal | If more of us valued food and cheer and
http://hans.fugal.net/ | song above hoarded gold, it would be a
http://gdmxml.fugal.net/ | merrier world.
| -- J.R.R. Tolkien
---------------------------------------------------------------------
GnuPG Fingerprint: 6940 87C5 6610 567F 1E95 CB5E FC98 E8CD E0AA D460