> Yes, the FA-101 works but I can't tell you the numbers for latency.
The site has a nice table of working setups, but no user emails. I'd ask
them directly.
> What do you mean with restrictions on buffers (= latency?)?
AFAIK, USB cards impose a 48 kHz (or 96 kHz), 48/3 buffer setup to please
everybody in the chain.
Jackd needs buffer sizes to be powers of 2, while usb-audio works in multiples of 1 ms.
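For what it's worth, the mismatch between those two constraints is easy to demonstrate (my own sketch, not from the thread): at 48 kHz a whole millisecond is 48 frames, and no power of two is divisible by 48.

```python
# Sketch: jackd wants period sizes that are powers of two, while
# usb-audio works in whole-millisecond chunks (48 frames per ms at
# 48 kHz).  Show that the two constraints never coincide.
RATE = 48000
FRAMES_PER_MS = RATE // 1000            # 48 frames per millisecond

for exp in range(5, 12):                # periods of 32 .. 2048 frames
    frames = 2 ** exp
    ms = frames / FRAMES_PER_MS
    aligned = frames % FRAMES_PER_MS == 0
    print(f"{frames:5d} frames = {ms:8.3f} ms  on 1 ms boundary: {aligned}")
```

Since 48 = 16 * 3, the factor of three guarantees that no power-of-two period is ever a whole number of milliseconds at 48 kHz, which is presumably why USB interfaces end up with compromise buffer setups.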
> we consider the 192 kHz more as marketing gag :)
That's to compete with Fireface.
> Hope that helps.
Thank you.
Dmitry.
Jens:
> I have noticed that the midi-mixing-console in Rosegarden is not
> listening to external events (like knobs on your fancy midi-controller)
It does in the current CVS version, as does the one in Studio to Go! 1.50. You have to connect your controller to the external controllers ALSA port, though.
However, I wouldn't recommend the code for this to anyone - it's a brutal hack on a design not intended for this - so I can't add much to your initial point...
Chris
Hi all,
Let me spread the word :) The fix for this bug, along with some other
usability fixes, finally arrives in today's Qsynth 0.2.4 release.
As you might know already, Qsynth is a fluidsynth GUI front-end
application, written in C++ around the Qt3 toolkit, using Qt Designer.
Please check it out from:
http://qsynth.sourceforge.net
Upgrade is highly recommended as this one fixes a very annoying crash
bug that has been lurking for ages.
As simply pasted from the change-log:
- All widget captions changed to include proper application title prefix.
- Attempt to bring those aging autoconf templates up to date; a sample SPEC
file for RPM builds is now included and generated at configure time.
- Missing icons on channel and soundfont setup context menus are now up;
bank/program splitter widget added to channel preset dialog.
- An abrupt segfault on engine restart has finally been fixed; this
issue has been an annoyance for ages and was a highly probable
showstopper when restarting an engine after changing the setup
settings. Not anymore, hopefully.
- New tool buttons were added to the main widget for adding a new
engine and removing the current one, trying to increase the
visibility of the multiple-fluidsynth-engine capability (for new users, at
least :)
- Set to use QApplication::setMainWidget() instead of registering the
traditional lastWindowClosed() signal to the quit() slot, just to let the
-geometry command line argument have some optional effect on X11.
- Minor configure and Makefile install fixes, e.g. Debian and Mac OS X
specialties. Also, install now does the right thing with target file modes
(thanks to Matt Flax and Ebrahim Mayat for pointing these out).
- Fixed message output being disabled when the messages limit option is
turned off (thanks to Wolfgang Woehl for spotting this one, while on qjackctl).
Hope you enjoy.
--
rncbc aka Rui Nuno Capela
rncbc(a)rncbc.org
Hi!
I have noticed that the midi-mixing-console in Rosegarden is not
listening to external events (like knobs on your fancy midi-controller)
I also recall that getting the Mx44 to display the current parameters of
the selected patch, without going into a loop (midi-update -> widget-update ->
midi-update -> ...), was a challenge. Most probably I just went about it
the wrong way :)
Should we perhaps write a (shortish) README on how to keep widgets
in sync with an associated MIDI input stream?
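The usual trick for the loop Jens describes is a re-entrancy guard, so that a widget update triggered by incoming MIDI does not echo back out as MIDI. A hypothetical sketch in Python (names are made up; Rosegarden itself is C++, where Qt's blockSignals() serves the same purpose):

```python
# Guard-flag pattern for keeping a widget in sync with a MIDI stream
# without feedback: incoming MIDI sets the widget value but suppresses
# the outgoing MIDI message that a user edit would normally produce.
class SyncedKnob:
    def __init__(self, send_midi):
        self.value = 0
        self.updating = False          # True while handling MIDI input
        self.send_midi = send_midi

    def on_midi_in(self, value):
        self.updating = True           # block the echo back to MIDI
        try:
            self.set_value(value)
        finally:
            self.updating = False

    def set_value(self, value):        # also called on user edits
        self.value = value
        if not self.updating:
            self.send_midi(value)

sent = []
knob = SyncedKnob(sent.append)
knob.set_value(42)      # user turns the on-screen knob -> MIDI goes out
knob.on_midi_in(64)     # controller knob moves -> widget updates, no echo
print(sent)             # [42]
```

The same idea works in either direction; the point is that exactly one side of the sync owns the update at any moment.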
mvh // Jens M Andreasen
> all depends on the definition of "reasonable price".
:)
I meant a light laptop. Ideally below the magic 2.2 kg mark, where most
of the light but still powerful laptops are. Possibly with an external CD drive.
And priced around $1500 in Russia.
Acer TravelMate 3002 looks good, has fast cpu, but shared video mem.
Sony VGN-S480 is cool (nv6200) but $2100. At least it exists.
The thing is, I can't afford 2 laptops - a powerful one and a light one.
> i can do 26x26, 64 frames/interrupt on my RME digiface cardbus with an
> nvidia video
> controller. HP Pavilion zd7000 "desktop replacement"
>
Great! Are you using nvidia binary drivers?
Dmitry.
Hello!
Another set of questions for experienced Linux Audio Users.
Mainly it's related to laptop performance.
It seems the choice of video system for modern laptop consists of two
main alternatives:
1) dedicated high performance controller (nvidia/ati) with closed source
drivers
2) shared memory controller (intel) with open source drivers
People on Windows forums (there's no such choice for Apples) prefer a
dedicated controller (with its own video memory), because - they say, and
under Windows - shared-memory video degrades performance and increases latencies.
I suppose things are different under Linux, because the minimal
achievable latency is directly related to interrupt processing: closed-source
drivers have arbitrary interrupt paths and are surely written to
maximise video performance, so they should play a bad role in latency.
Moreover, they cannot be fixed. Open-source ones at least can be.
Or am I completely wrong, and shared video memory makes things bad on the
hardware side (by locking the PCI bus, for example)?
So, the question is: what to choose, an integrated Intel solution or an
ATI/NVIDIA one (in which case NVIDIA is preferred, because of driver
quality)?
Thank you.
Dmitry.
P.S. As a target system, imagine laptop with RME Cardbus.
Hi all,
I'm just about to go on holidays with my family. While I'm away I will
have a little time here and there to meet up with some friendly linux
audio types.
I'll be in the following places on the following dates:
London UK 2-16 Oct
Newcastle UK 16-19 Oct
Copenhagen DK 19-26 Oct
Anyone who would like to meet up for coffee or a beer or something should
email me at erikd AT my usual domain name. I'm up for a chat or being
dragged along to a Linux user group type function.
Cheers,
Erik
--
+-----------------------------------------------------------+
Erik de Castro Lopo
+-----------------------------------------------------------+
"If you think C++ is not overly complicated, just what is a
protected abstract virtual base pure virtual private destructor
and when was the last time you needed one?" -- Tom Cargill
Tim Goetze:
>>>breathe deeply. think of snakes. say "python".
>>
>>Are you serious? Do you know python? I hope not...
>>
>>I don`t want to start a flame-war over programming languages,
>>but I know both scheme and python very well, and would
>>never consider python as an extension language again.
>
>Would you care to back up this judgment by a few facts?
No, I don't. I don't want to start a flame-war... I'll just say
I know both languages very well. It's hard (though not impossible)
to explain why Scheme is a better language.
If you knew Lisp well, you would probably know what I meant.
>One has to consider that a scripting extension for an application like
>ardour, with an intended audience of little to no programming
>knowledge, will have to be as painless and intuitive to learn and use
>as possible and still provide power and freedom of expression to the
>savvy.
In my opinion, scheme is such a language. It takes some time
to learn if you are not used to s-expressions though.
--
There was a question about what kind of accuracy would be required. Well,
the more the better. Without electronic assistance, this was done by letting
the clock run for a day, measuring it against a known good time source,
adjusting the pendulum, and repeating. Running the clock for longer periods
between measurements obviously makes the adjustment more accurate.
With electronic assistance the same rules apply. Running the clock longer
between measurements makes the adjustment more accurate. Let's do some
math. Suppose we use one pendulum cycle to measure the period, and say that
this clock is geared for a one-second pendulum cycle. If we can measure
that one second with an accuracy of a hundredth of a second, then our average
error will be half that, or
0.005 sec / sec or
0.3 sec / min or
0.3 min / hr or
7.2 min / day or
216 minutes over 30 days
If this is a 31-day clock, being over 3 hours off per winding is not very good.
If I can get millisecond accuracy, then I'm down to 21.6 minutes over 30
days, which is still not very good. Now if I increase my sample size to 10
pendulum cycles and still maintain millisecond accuracy, I can set the clock
to within 2.16 minutes lost/gained in 30 days. That would be a good start. It
would only take me 10 seconds to decide whether to adjust the clock slower or
faster, and I would eventually be correct to within 0.072 minutes per day when
finished adjusting to the accuracy level of this instrument.
Further adjustments using longer sample times would then be able to increase
the accuracy to the desired level or at least to whatever level of accuracy
the clock works are capable of.
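The arithmetic above can be wrapped in a small helper (a sketch in Python; the function name is mine) for playing with measurement resolution and sample length:

```python
# Drift estimate from the reasoning above: the average measurement
# error is half the resolution, spread over the measured sample of
# pendulum cycles, then scaled up to a 30-day winding.
def drift_per_30_days(resolution_s, cycles, period_s=1.0):
    err_per_sec = (resolution_s / 2) / (cycles * period_s)
    return err_per_sec * 86400 * 30 / 60   # minutes per 30 days

print(drift_per_30_days(0.01, 1))     # ~216 min: 1 cycle at 1/100 s resolution
print(drift_per_30_days(0.001, 1))    # ~21.6 min: 1 cycle at 1 ms resolution
print(drift_per_30_days(0.001, 10))   # ~2.16 min: 10 cycles at 1 ms resolution
```

This matches the figures in the text: each factor of ten in resolution or sample length buys a factor of ten in achievable accuracy.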
Returning to one of my original questions: does anyone have suggestions for
useful libraries or example source code to study as a starting point for this project?
----- End forwarded message -----