[LAD] FW: [piksel] The future of videojack ?

Juuso Alasuutari juuso.alasuutari at gmail.com
Tue May 6 21:31:35 UTC 2008

Paul Davis wrote:
> There has never been any disagreement with the idea that videojack is a
> cool idea. What there has been is a widespread feeling in the JACK
> developer community that the video JACK stuff has been done
> suboptimally, in ways that don't reflect a good understanding of JACK's
> design, and without bothering to actually ask us "how should we do
> this?" I'd love to see videojack be a part of mainline JACK, but it's not
> going to happen with the current set of patches that have been proposed.
> To be honest, I was surprised that everyone else at the JACK "summit" at
> LAC2008 seemed to agree with me about that.

VideoJACK does sound nice, and it would be a shame to witness forkage
when things could be settled in a more productive manner. I'm in no way
qualified to draft any API plans for JACK, which arguably is the perfect
excuse for doing just that. ;)

I've understood that the current VideoJACK approach is to run two JACK
servers, one for video and one for audio. If it already works with two
completely different servers, wouldn't it also work if audio and video
ports just had different callbacks inside the same server?

The audio callback API wouldn't need to be changed in any way. Video
streaming capabilities would require adding a smallish additional API.
Here's a crude suggestion:

  * Set the video frame rate to num1/num2
int jack_set_video_rate (jack_client_t  *client,
                         jack_nframes_t  num1,
                         jack_nframes_t  num2);

  * Set the video processing callback function
int jack_set_video_callback (jack_client_t       *client,
                             JackProcessCallback  video_callback,
                             void                *arg);

  * Get the number of frames in a video port buffer
jack_nframes_t jack_video_get_frame_count (void *port_buffer);

  * Get a frame from a video port buffer
int jack_video_get_frame (jack_video_frame_t *frame,
                          void               *port_buffer,
                          jack_nframes_t      event_index);

  * A video frame type
typedef struct _jack_video_frame_t jack_video_frame_t;
struct _jack_video_frame_t
{
	/* The video frame's timestamp (corresponds to
	   an audio frame index within the same period) */
	jack_nframes_t  time;

	/* The size (in bytes) of the raw frame data */
	size_t          size;

	/* Pointer to raw video frame data. Can be of any format;
	   it is up to the application to manage that stuff */
	void           *raw_data_ptr;
};

What is stopping us from adding an API layer such as the one outlined
above? It would seem that separate callbacks for audio and video would
serve the VideoJACK crew's needs. Is there something I'm not seeing here?

