Tito Latini tito.01beta at gmail.com
Fri Sep 23 09:42:11 UTC 2016

On Thu, Sep 22, 2016 at 04:36:17PM -0500, Paul Davis wrote:
> On Thu, Sep 22, 2016 at 4:27 PM, Tito Latini <tito.01beta at gmail.com> wrote:
> > On Thu, Sep 22, 2016 at 12:49:42PM -0500, Paul Davis wrote:
> > > The innovation is defining an API and protocol based on 3 concepts:
> > >
> > >     tempo synchronization
> >
> > an integral to get the position with the new bpm
> >
> across a network? with multiple tempo masters?

I respectfully think you don't understand the technical problem.

We send/receive information about the time, the tempo changes and/or
the position of a beat, the updated number of participants, the time
window for the next temporal change, a "too soon" message to discard
or defer a change, the minimal time window, latency, etc., but not the
performance of the change.

That's a protocol.
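As a rough illustration only, the message set above could be modeled
like this in C; every type and field name here is hypothetical, not
taken from any existing protocol:

```c
#include <stdint.h>

/* Hypothetical message kinds; the names mirror the list above and
   are illustrative only. */
typedef enum {
    MSG_TEMPO_CHANGE,  /* start-time, end-time, from-bpm, to-bpm */
    MSG_BEAT_ALIGN,    /* from-position, to-beat */
    MSG_PEER_COUNT,    /* updated number of participants */
    MSG_TIME_WINDOW,   /* window to get the next temporal change */
    MSG_TOO_SOON       /* discard or defer a change */
} msg_type;

/* One flat message; a real protocol would of course use per-type
   payloads and a proper wire encoding. */
typedef struct {
    msg_type type;
    double   start_time;    /* seconds on the shared clock */
    double   end_time;
    double   from_bpm;
    double   to_bpm;
    double   from_position; /* in beats */
    double   to_beat;
    uint32_t peer_count;
} sync_msg;
```

Note that nothing about how the change is performed travels on the
wire: a peer receiving MSG_TEMPO_CHANGE is free to realize it with any
curve it likes.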

The integral is software-side, not across the network. It's not always
necessary, because it depends on the musical instrument. For example,
a string quartet uses a particular curve to change the tempo; the
curve is possibly different with pizzicato; a drum machine plays a
ritardando or fills the time window with a roll. The modus operandi is
up to the application; the network is only used to share information
about the clock and the possible changes (start-time, end-time,
from-bpm, to-bpm, from-position, to-beat, etc.).
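For the simplest curve, a linear ramp, the integral has a closed form:
the beats crossed are the average bpm times the elapsed time. A
minimal sketch (the function name is mine):

```c
/* Beats crossed while the bpm ramps linearly between two values:
   the integral of bpm(t)/60 dt over `seconds`, which for a linear
   curve reduces to the average tempo times the elapsed time. */
double ramp_beats(double from_bpm, double to_bpm, double seconds)
{
    return 0.5 * (from_bpm + to_bpm) * seconds / 60.0;
}

/* Example: ramping 120 -> 140 bpm over 6 seconds crosses
   0.5 * 260 * 6 / 60 = 13 beats. */
```

A different curve (for pizzicato, for a drum roll) just means a
different integrand; only the shared start/end data comes from the
network.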

Under the hood, beat synchronization is a change of the local bpm. For
example, tempo-change + beat alignment is a combination of two
integrals (software-side) if the instrument plays accelerando or
ritardando. The coder of the program/library is responsible for the
curve palette and the optimization.
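One hedged sketch of that combination, assuming two linear ramps of
equal duration: the endpoint tempo is fixed by the tempo change, and
the midpoint tempo is solved so that the total integral matches the
requested beat alignment (the name and the equal-split choice are
mine, not part of any protocol):

```c
/* Reach `to_bpm` after `seconds` while crossing exactly `beats`
   beats, using two linear ramps of equal duration (two integrals).
   Total beats = seconds/240 * (from_bpm + 2*mid_bpm + to_bpm),
   solved here for the intermediate tempo at the midpoint. */
double midpoint_bpm(double from_bpm, double to_bpm,
                    double beats, double seconds)
{
    return 0.5 * (240.0 * beats / seconds - from_bpm - to_bpm);
}

/* Example: staying at 120 bpm for 6 seconds gives 12 beats; to fit
   13 beats in the same window and land back on 120 bpm, the curve
   must peak at midpoint_bpm(120, 120, 13, 6) = 140 bpm. */
```

An application is free to replace the two linear segments with any
curve from its palette, as long as the integral lands on the agreed
beat at the agreed time.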

One common protocol to communicate, many programs/libraries to play.

More information about the Linux-audio-dev mailing list