Well, this has been discussed to death on the jack-devel lists. I can see that,
from an audio developer's point of view, it would be nice to have video within
the same server as audio. However, there are fundamental differences between
video and audio which, in my mind, make this impractical.
Firstly, there is the problem of latency: for audio, the aim is generally to
have a latency of less than 4 ms. For video, since we are dealing with much
larger chunks of data, a latency an order of magnitude greater than this is
usually acceptable.
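To put rough numbers on that gap: 4 ms at a 48 kHz sample rate is only 192
samples' worth of audio, while a single frame of 25 fps video already spans
40 ms, ten times as long.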
Second, the timing is different. For audio you generally have a 1024-sample
buffer and a rate of 44.1 kHz or 48 kHz. Video usually requires something like
25 fps. So you can either have two clocks, or you can split the video into
chunks (ugh); both solutions have problems.
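To make the clock mismatch concrete: a 1024-sample period at 48 kHz lasts
1024 / 48000 ≈ 21.3 ms, whereas one frame at 25 fps lasts 1/25 = 40 ms, so a
video frame covers 1.875 audio periods, and frame boundaries only line up with
buffer boundaries every 320 ms (8 frames, 15 audio periods).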
If you can solve these problems, then there is absolutely nothing stopping you
from running video and audio in the same server (video simply adds a new port
type; a rough sketch follows below).
Regards,
Salsaman.
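For illustration only, here is a minimal sketch of what "a new port type" could
look like at the client-API level, using the ordinary jack_port_register()
call. The "video/raw" type string and the per-period buffer size are invented
for the example, and whether a given server actually accepts a custom port
type is a separate question.

/* Sketch: register one audio port and one port of a hypothetical video type. */
#include <stdio.h>
#include <stdlib.h>
#include <jack/jack.h>

#define VIDEO_PORT_TYPE    "video/raw"        /* hypothetical type string      */
#define VIDEO_BUFFER_BYTES (720 * 576 * 4)    /* one RGBA PAL frame, assumed   */

int main(void)
{
    jack_status_t status;
    jack_client_t *client = jack_client_open("video_sketch", JackNullOption, &status);
    if (client == NULL) {
        fprintf(stderr, "could not connect to a JACK server\n");
        return EXIT_FAILURE;
    }

    /* A normal audio output port... */
    jack_port_t *audio_out = jack_port_register(client, "audio_out",
                                                JACK_DEFAULT_AUDIO_TYPE,
                                                JackPortIsOutput, 0);

    /* ...and a port of a custom type; for non-built-in types the last
     * argument gives the per-period buffer size in bytes. */
    jack_port_t *video_out = jack_port_register(client, "video_out",
                                                VIDEO_PORT_TYPE,
                                                JackPortIsOutput,
                                                VIDEO_BUFFER_BYTES);

    if (audio_out == NULL || video_out == NULL)
        fprintf(stderr, "port registration failed\n");

    jack_client_close(client);
    return 0;
}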
Video in jack1 won't happen, for several reasons that can be explained again:
we want to fix and release jack1 soon, and video in jack is too big a change
to be integrated in the current state of the proposed patch.
The future of jack is now jack2, based on the new jackdmp implementation
(http://www.grame.fr/~letz/jackdmp.html). A lot of work has already been done
in this code base, which is now API-equivalent to jack1. New features are
already being worked on, such as the D-Bus based control (developed in the
"control" branch) and the NetJack rework (developed in the "network" branch).
I think a combined "video + audio in a single server" approach is perfectly
possible: it would require having two separate graphs for audio and video,
each running at its own rate. Video and audio would be handled in different
callbacks and thus in different threads (probably running at two different
priorities, so that audio can "interrupt" video). Obviously doing that the
right way would require a bit of work, but it is probably much easier to
design and implement in the jackd2 codebase.
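As a minimal sketch of that priority arrangement (plain POSIX threads, not
actual jackd2 internals): two worker threads, with the audio thread given a
higher SCHED_FIFO priority than the video thread so that audio work can
preempt video work. The priority values 80 and 60 are arbitrary examples, and
the loop bodies are empty placeholders.

#include <pthread.h>
#include <sched.h>
#include <stdio.h>

static void *audio_loop(void *arg) { (void)arg; /* audio graph would run here */ return NULL; }
static void *video_loop(void *arg) { (void)arg; /* video graph would run here */ return NULL; }

static int start_worker(pthread_t *tid, void *(*fn)(void *), int priority)
{
    pthread_attr_t attr;
    struct sched_param param = { .sched_priority = priority };
    int err;

    pthread_attr_init(&attr);
    pthread_attr_setinheritsched(&attr, PTHREAD_EXPLICIT_SCHED);
    pthread_attr_setschedpolicy(&attr, SCHED_FIFO);
    pthread_attr_setschedparam(&attr, &param);

    err = pthread_create(tid, &attr, fn, NULL);
    pthread_attr_destroy(&attr);

    if (err != 0)                                   /* no real-time permission:  */
        err = pthread_create(tid, NULL, fn, NULL);  /* fall back to plain thread */
    return err;
}

int main(void)
{
    pthread_t audio_tid, video_tid;

    /* Audio gets the higher priority so it can "interrupt" video. */
    if (start_worker(&audio_tid, audio_loop, 80) != 0 ||
        start_worker(&video_tid, video_loop, 60) != 0) {
        fprintf(stderr, "could not start worker threads\n");
        return 1;
    }

    pthread_join(audio_tid, NULL);
    pthread_join(video_tid, NULL);
    return 0;
}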
Thus I think the better overall approach, to avoid a "video jack" fork, is to
work in this direction, possibly by implementing video jack with the
"separate server" idea first (since it is easier to implement). This could be
started right away in a jack2 branch.
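To make the "separate server" idea concrete, here is a minimal client sketch
that attaches to a second, independently started JACK server by name. The
server name "video" is an example; such a server would have to be launched
separately with something like "jackd -n video -d dummy" (exact options
depend on the setup).

#include <stdio.h>
#include <stdlib.h>
#include <jack/jack.h>

int main(void)
{
    jack_status_t status;

    /* JackServerName makes jack_client_open look up the named server
     * ("video" here) instead of the default one. */
    jack_client_t *client = jack_client_open("video_client",
                                             JackServerName,
                                             &status,
                                             "video");
    if (client == NULL) {
        fprintf(stderr, "could not connect to server \"video\" (status 0x%x)\n",
                (unsigned) status);
        return EXIT_FAILURE;
    }

    printf("attached to server \"video\" running at %u frames/s\n",
           (unsigned) jack_get_sample_rate(client));

    jack_client_close(client);
    return EXIT_SUCCESS;
}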
What do you think?
Stephane