Looking for zero/low latency AUv3 host for midi "stuff"

I'm looking for something that can host MIDI AUs (like Bram's Scaler, for instance) and can be set to a latency of zero samples. AUM etc. only go as low as 64 samples. As I won't be processing audio, I don't really need an "audio host".

Any chance a beast like this has been bred since midi AU's popped into existence?

Comments

  • Do you mean the buffer size? Even if you're not recording audio, anything input by the user will have some latency, whether it's triggering MIDI, audio or visuals.

    64 is really fast. Here's a link to an article explaining the options for near-zero latency. It usually involves direct monitoring or monitoring the performance before it is processed.

    https://ask.audio/articles/how-to-achieve-true-zero-latency-monitoring-in-your-daw

  • It doesn't really matter if it's MIDI or audio; both are typically processed the same way. Think of it like this: with a buffer size of 64 samples, the audio thread will wake up every 64 samples (about 1.45 ms at 44.1 kHz), at which point it needs to provide the next 64 output samples. It's here that both audio and MIDI are processed, so even if there's no audio involved, the general setup doesn't change. Hence, there's no difference between a MIDI-only host and an audio/MIDI host.

    Lowering the buffer size makes the audio thread wake up more often, which helps with latency, but it also gives the CPU less time to generate the next batch of samples, so going lower has its drawbacks.
    Also, keep in mind that next to the buffer size, the sample rate plays into this as well: with higher rates latency goes down but CPU load goes up (see the sketch at the end of this comment).

    Now, like @Thardus said, there will always be some latency and there are different ways to deal with it depending on the setup but something as low as 64 samples should do in most cases.

    Hope that helps.
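
    For illustration, a minimal Swift sketch of the arithmetic above (AVAudioSession and setPreferredIOBufferDuration are the standard iOS API; the buffer size you actually get back is up to the system):

        import AVFoundation

        // One render cycle of latency = bufferSize / sampleRate.
        func bufferLatencyMs(bufferSize: Double, sampleRate: Double) -> Double {
            (bufferSize / sampleRate) * 1000.0
        }

        print(bufferLatencyMs(bufferSize: 64, sampleRate: 44_100))   // ≈ 1.45 ms
        print(bufferLatencyMs(bufferSize: 64, sampleRate: 96_000))   // ≈ 0.67 ms: higher rate, lower latency, more CPU

        // A host only *requests* a buffer duration; iOS may round it up
        // to the nearest size the hardware actually supports.
        let session = AVAudioSession.sharedInstance()
        try? session.setPreferredIOBufferDuration(64.0 / 44_100.0)
        try? session.setActive(true)
        print(session.ioBufferDuration)  // the duration actually granted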

  • Actually, I have zero-latency MIDI-only AUv3 hosting working experimentally in the current development version of MidiFire, with some fancy trickery. It won't work with all plugs though (those that are strictly time-aligned), but for those that aren't fussy about being called with random render cycles (Stream Byter being one of them) this works quite well.

  • You still need to keep some latency in order to make use of timestamps, otherwise you'll introduce jitter!

  • There is no latency, as it's running outside of an audio engine while still satisfying AUBase validation, so render cycles trigger on demand with each event. This is MIDI FX only, and not every plug will work properly. Those that are not fussy about the render-cycle parameters and only work with the incoming timestamps are candidates. Like I said, it's experimental, but it is working for me with Stream Byter and the Apple MIDI Thru demo at least. But you are right that if the plug is dependent on the render cycle being triggered at the correct time, the results are 'interesting' :-)

    Of course it's not exactly zero latency, since the event still has to pass through the AU and the plug, but it's not tied to the render cycle/buffer size.
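
    Purely as a hypothetical sketch of the general idea (this is not necessarily how MidiFire does it): push a MIDI event into a MIDI-FX AUv3 and immediately pull a render cycle for it, with no audio engine running. Here `unit` is assumed to be an already-instantiated AUAudioUnit with render resources allocated, and many plugs will not tolerate being rendered this way:

        import AudioToolbox

        // Hypothetical: fire one render cycle per incoming MIDI event,
        // decoupled from any audio buffer clock.
        func fireEvent(_ unit: AUAudioUnit, bytes: [UInt8], sampleTime: Float64) {
            bytes.withUnsafeBufferPointer { buf in
                unit.scheduleMIDIEventBlock?(AUEventSampleTimeImmediate, 0, buf.count, buf.baseAddress!)
            }
            var flags = AudioUnitRenderActionFlags()
            var ts = AudioTimeStamp()
            ts.mSampleTime = sampleTime
            ts.mFlags = .sampleTimeValid
            // A dummy buffer list just to satisfy the render call; no audio is used.
            var list = AudioBufferList(
                mNumberBuffers: 1,
                mBuffers: AudioBuffer(mNumberChannels: 2, mDataByteSize: 0, mData: nil))
            _ = unit.renderBlock(&flags, &ts, 1, 0, &list, nil)
        }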

  • @audeonic said:
    There is no latency, as it's running outside of an audio engine while still satisfying AUBase validation, so render cycles trigger on demand with each event. This is MIDI FX only, and not every plug will work properly. Those that are not fussy about the render-cycle parameters and only work with the incoming timestamps are candidates. Like I said, it's experimental, but it is working for me with Stream Byter and the Apple MIDI Thru demo at least. But you are right that if the plug is dependent on the render cycle being triggered at the correct time, the results are 'interesting' :-)

    Of course it's not exactly zero latency, since the event still has to pass through the AU and the plug, but it's not tied to the render cycle/buffer size.

    That sounds very interesting, happy to hear about what you're working on 👍🏼

  • @j_liljedahl said:
    You still need to keep some latency in order to make use of timestamps, otherwise you'll introduce jitter!

    How low could AUM go, without audio, and maintain a stable midi clock?

  • Buffer size affects audio, not midi

  • @ToMess said:
    Buffer size affects audio, not midi

    AU MIDI is handled in the Core Audio thread B)

  • @brambos said:

    @ToMess said:
    Buffer size affects audio, not midi

    AU MIDI is handled in the Core Audio thread B)

    True, but @ToMess, your buffer size affects the amount of MIDI you can output in one render cycle. I noticed this when I wanted a panic to send 128 MIDI note offs for all channels - it just doesn't fit in one buffer.
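
    As a sketch of one way a plug could cope with that (the class name and the per-cycle budget are made up for illustration): queue the panic and drain a slice of it on each render cycle, using the host-provided AUMIDIOutputEventBlock:

        import AudioToolbox

        // Illustrative only: spread a MIDI panic (a note off for every note on
        // every channel = 16 x 128 = 2048 events) across several render cycles
        // instead of trying to squeeze it into a single buffer.
        final class PanicQueue {
            private var pending: [[UInt8]] = []
            private let maxPerCycle = 64  // assumed per-buffer budget; host-dependent

            func queuePanic() {
                for channel: UInt8 in 0..<16 {
                    for note: UInt8 in 0..<128 {
                        pending.append([0x80 | channel, note, 0])  // note off
                    }
                }
            }

            // Call once per render cycle from the audio thread. (A real
            // implementation would use a preallocated, lock-free queue;
            // allocating like this is not realtime-safe.)
            func drain(at sampleTime: AUEventSampleTime, send: AUMIDIOutputEventBlock) {
                let batch = pending.prefix(maxPerCycle)
                for event in batch {
                    event.withUnsafeBufferPointer { buf in
                        _ = send(sampleTime, 0, buf.count, buf.baseAddress!)
                    }
                }
                pending.removeFirst(batch.count)
            }
        }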

  • @brambos said:

    @ToMess said:
    Buffer size affects audio, not midi

    AU MIDI is handled in the Core Audio thread B)

    Okay, well that makes no sense xD Why can't CoreMIDI handle MIDI on iOS? Or is this just some workaround to get MIDI working in AUs?

  • @ToMess said:

    @brambos said:

    @ToMess said:
    Buffer size affects audio, not midi

    AU MIDI is handled in the Core Audio thread B)

    Okay, well that makes no sense xD Why can't CoreMIDI handle MIDI on iOS? Or is this just some workaround to get MIDI working in AUs?

    It’s not even a workaround. It’s the only way to ensure MIDI and audio are in sample-accurate sync and processed together on the realtime thread.
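
    For context, a small Swift sketch of what that looks like from a plug-in's point of view (the function name is illustrative): inside the render block, MIDI arrives as a linked list of AURenderEvents, each stamped with a sample offset into the current audio buffer:

        import AudioToolbox

        // Walk the render-thread event list; every MIDI event is positioned
        // sample-accurately within this render cycle's buffer.
        func handleEvents(_ head: UnsafePointer<AURenderEvent>?, bufferStart: AUEventSampleTime) {
            var event = head
            while let e = event {
                if e.pointee.head.eventType == .MIDI {
                    let midi = e.pointee.MIDI
                    let offset = midi.eventSampleTime - bufferStart  // samples into this buffer
                    // ... apply midi.data (midi.length bytes) exactly `offset` samples in
                    _ = offset
                }
                event = e.pointee.head.next.map { UnsafePointer($0) }
            }
        }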

  • @brambos said:

    @ToMess said:

    @brambos said:

    @ToMess said:
    Buffer size affects audio, not midi

    AU MIDI is handled in the Core Audio thread B)

    Okay, well that makes no sense xD Why can't CoreMIDI handle MIDI on iOS? Or is this just some workaround to get MIDI working in AUs?

    It’s not even a workaround. It’s the only way to ensure MIDI and audio are in sample-accurate sync and processed together on the realtime thread.

    Okay, that makes sense when the MIDI is controlling audio inside the iPad. But when it's just AU plugins spitting out MIDI outside the iPad, or the iPad passing MIDI through from and to external gear (routed through AUM), is this still the case? I guess making the plugins understand the difference between their MIDI controlling synths inside iOS and outside of iOS would mostly be work for something that makes no real difference?

  • @ToMess said:
    Okay, that makes sense when the MIDI is controlling audio inside the iPad. But when it's just AU plugins spitting out MIDI outside the iPad, or the iPad passing MIDI through from and to external gear (routed through AUM), is this still the case? I guess making the plugins understand the difference between their MIDI controlling synths inside iOS and outside of iOS would mostly be work for something that makes no real difference?

    Same thing. Since iOS devices aren't realtime (unlike dedicated DSP hardware), the audio thread is the most stable timing source you get and, as mentioned before, the only way to ensure sample-accurate timing, so using something else is in most cases just asking for trouble.
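
    A small sketch of that idea (the function name is illustrative): when forwarding MIDI to external gear, the event's system-clock timestamp can be derived from the render cycle's own AudioTimeStamp, so outgoing MIDI inherits the audio thread's timing:

        import AVFoundation

        // Convert a sample offset within the current render cycle into a
        // host-time timestamp (e.g. for a CoreMIDI send to external gear).
        func eventHostTime(sampleOffset: Double,
                           render ts: AudioTimeStamp,
                           sampleRate: Double) -> UInt64 {
            // mHostTime marks the start of this render cycle on the system
            // clock; AVAudioTime converts seconds to host-clock ticks.
            return ts.mHostTime + AVAudioTime.hostTime(forSeconds: sampleOffset / sampleRate)
        }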

  • @MonkeyDrummer
    This has proven to be quite an enlightening thread; however, I must confess I am now really curious: what is your end goal?

    I initially assumed it to be drum-trigger related, but I have always heard that ±5 ms is essentially indiscernible for any practical application.

    So now my thought is that it's perhaps a long series of MIDI FX: perhaps 100 instances of Rozeta cells set to trigger every note on every synth you have instantaneously? Or a single note arpeggiated 100 times?
