I don't have any disk I/O operations; I receive data from a USB dongle at a
constant rate.
If I have N clients running, then the time I can spend in each callback
for processing is

    time_per_callback = audio_buffer_period - (process_time * N)

Is that correct?
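For example, assuming a 48 kHz sample rate, 256-frame buffers, and
illustrative figures for N and process_time (none of these are stated
above), the arithmetic would be:

#include <stdio.h>

/* Rough per-cycle budget -- every figure below is an illustrative assumption. */
int main(void)
{
    const double sample_rate   = 48000.0;  /* Hz */
    const int    buffer_frames = 256;      /* as reported by jack_get_buffer_size() */
    const int    n_clients     = 4;        /* N above */
    const double process_ms    = 0.5;      /* per-client process() cost */

    double period_ms = 1000.0 * buffer_frames / sample_rate;  /* ~5.33 ms */
    double slack_ms  = period_ms - process_ms * n_clients;    /* time left over */

    printf("period = %.2f ms, slack = %.2f ms\n", period_ms, slack_ms);
    return 0;
}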
-ben
________________________________
From: Paul Davis <paul@linuxaudiosystems.com>
Sent: Thursday, September 14, 2017 11:10 PM
To: benravin
Cc: linux-audio-dev@lists.linuxaudio.org
Subject: Re: [LAD] Jack buffer requirements
JACK has no requirements other than that you can run your process() callback without
blocking, every time.
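In practice, the usual way to guarantee that is a lock-free FIFO between the
non-RT thread and process(): the decoder writes into the FIFO at its own pace,
and process() only ever does a non-blocking read. A minimal sketch using
JACK's own ringbuffer (port and variable names are illustrative, not from the
original post):

#include <string.h>
#include <jack/jack.h>
#include <jack/ringbuffer.h>

/* Created at startup with jack_ringbuffer_create(); the non-RT decoder
   thread fills it with jack_ringbuffer_write(). */
static jack_ringbuffer_t *rb;
static jack_port_t *out_port;   /* registered with jack_port_register() */

static int process(jack_nframes_t nframes, void *arg)
{
    (void)arg;
    jack_default_audio_sample_t *out = jack_port_get_buffer(out_port, nframes);
    size_t want = nframes * sizeof(*out);

    /* Non-blocking: take whatever is available, never wait for the decoder. */
    size_t got = jack_ringbuffer_read(rb, (char *)out, want);
    if (got < want)
        memset((char *)out + got, 0, want - got);  /* underrun: pad with silence */

    return 0;
}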
How much buffering needs to exist to make sure that can happen depends hugely on what the
non-RT part of things is doing. For comparison, when the non-RT part does disk i/o, you
need to be ready for potentially several seconds of delay in refilling (or emptying)
buffers. If the disk i/o wasn't there, the buffering requirements would be much
smaller.
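One common rule of thumb (an assumption here, not something stated above) is
to size the FIFO for the worst-case stall of the non-RT producer, times a
safety margin:

#include <jack/ringbuffer.h>

/* Size the FIFO from the worst-case producer stall -- both figures below
   are assumed, not measured. */
jack_ringbuffer_t *make_fifo(double sample_rate, int channels)
{
    const double worst_stall_ms = 50.0;  /* e.g. a USB hiccup, no disk i/o */
    const double safety         = 2.0;   /* margin over the worst case */

    size_t bytes = (size_t)(sample_rate * (worst_stall_ms / 1000.0) * safety)
                   * (size_t)channels * sizeof(float);
    return jack_ringbuffer_create(bytes);
}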
On Thu, Sep 14, 2017 at 1:10 PM, benravin <ben.alex@outlook.com> wrote:
I want to know the optimal buffering I can use when designing my
application.
My use case is as follows: I receive digital radio signals through a tuner
and do the channel and audio decoding in separate threads.
Finally, the audio is sent to the JACK callback and played out.
How much buffering is enough for real-time streaming between the threads?
I want to keep the buffering between these threads optimal.
Please suggest guidelines for use with JACK.
-ben