What I want to do is use the resources I have to run multiple signal generation and processing chains asynchronously, in parallel, and then use the final audio-hardware-synchronized chain to resample them all into one, perhaps using the Zita tools. Does anyone know if this is possible? I saw this flow structure work very well in the video domain, quite a few years ago.
That's not what you want to do at all.
JACK is designed to be a *synchronous* system. All clients process audio corresponding to the same period of time, precisely in sync with each other. You do not want to "resample them all into one", and certainly not with the zita tools.
A "correct" digital audio processing and/or synthesis environment consists of a single audio interface (or at least, a single digital sample ("word") clock). You can run any number of JACK clients, connected in arbitrary ways. But using multiple audio interfaces (which is what the zita tools are related to) is not the right thing to do unless you are forced to by lack of funds or inappropriate hardware.
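To make the synchronous model concrete, here is a minimal sketch of a JACK client in C. Every client registers a process callback; the server invokes all clients' callbacks for the same period of frames, driven by the single word clock, so no resampling between clients is ever needed. This assumes libjack is installed and a JACK server is running; the client name "passthru" and the port names are arbitrary choices for illustration.

```c
#include <stdio.h>
#include <string.h>
#include <jack/jack.h>

static jack_port_t *in_port, *out_port;

/* Called by the JACK server once per period. All clients' process
 * callbacks run for the same span of frames, in sync with each other. */
static int process(jack_nframes_t nframes, void *arg)
{
    jack_default_audio_sample_t *in  = jack_port_get_buffer(in_port, nframes);
    jack_default_audio_sample_t *out = jack_port_get_buffer(out_port, nframes);
    /* Simple pass-through: copy input to output unchanged. */
    memcpy(out, in, sizeof(jack_default_audio_sample_t) * nframes);
    return 0;
}

int main(void)
{
    jack_client_t *client = jack_client_open("passthru", JackNullOption, NULL);
    if (!client) {
        fprintf(stderr, "could not connect to JACK server\n");
        return 1;
    }

    in_port  = jack_port_register(client, "in",  JACK_DEFAULT_AUDIO_TYPE,
                                  JackPortIsInput, 0);
    out_port = jack_port_register(client, "out", JACK_DEFAULT_AUDIO_TYPE,
                                  JackPortIsOutput, 0);
    jack_set_process_callback(client, process, NULL);

    /* After activation the callback runs in the server's process cycle,
     * locked to the same sample clock as every other client. */
    jack_activate(client);
    getchar();  /* keep the client alive until Enter is pressed */

    jack_deactivate(client);
    jack_client_close(client);
    return 0;
}
```

You can run any number of such clients and wire their ports together in arbitrary graphs; the server schedules them all within one period, which is why "processing chains" in JACK are just connected clients, not independently clocked streams.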