Audiobus: Use your music apps together.
What is Audiobus? — Audiobus is an award-winning music app for iPhone and iPad which lets you use your other music apps together. Chain effects on your favourite synth, run the output of apps or Audio Units into an app like GarageBand or Loopy, or select a different audio interface output for each app. Route MIDI between apps — drive a synth from a MIDI sequencer, or add an arpeggiator to your MIDI keyboard — or sync with your external MIDI gear. And control your entire setup from a MIDI controller.
Audiobus is the app that makes the rest of your setup better.
Comments
This! ^^
To be honest, I'd like to see Audiobus slowly evolve into a DAW with an integrated MIDI sequencer. They seem to care most about iOS musicians, and I'm sure they could make the best DAW.
I loaded up a couple of synths and a drum machine, hit play and they all played together. That's all I've been looking for.
I'm sure the Audiobus team could make a fine DAW if they decided to, but I wouldn't want it to come at the expense of Audiobus itself.
Just a small little tab at the bottom called 'sequencer'
In all seriousness though, it's far-fetched, so I'm not expecting it.
http://musicappblog.com/audiobus-3-review/ comments on this question.
This is spot on.
Never expect the Spanish Inquisition.
Our weapons are fear and surprise
Please! Please! Please! Please!!!!!!!
And ruthless efficiency
Okay, you seem to be the guys I’ve been looking for
My touring stage setup is an iTrack Dock with a SoftStep 2, and I want to start using AltiSpace and Kosmonaut, as they are so much better than the delays and reverbs in Bias.
So far, using AUM, I have Bias FX and two sends on one channel, then a channel each for delay and reverb. How can I add Loopy to this using AB3?
Use AB3 as the master host. Set Bias as the first channel input then output to Loopy. On the next channel set Loopy as the input then output to AUM. Do all your AUv3 effects, sends and mixing in AUM. Best part is, saving a session in AB3 will recall Loopy and AUM states so you don’t need to save elsewhere.
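If it helps to see that chain written out, here's the routing from the post above sketched as plain data. This is purely illustrative; none of these types or names correspond to an actual Audiobus or AUM API:

```swift
// Purely illustrative: the routing described above, modelled as plain data.
// Neither `Channel` nor the string names are a real Audiobus or AUM API.
struct Channel {
    let input: String
    let output: String
}

// Audiobus 3 acts as the master host, with two channels:
let ab3Channels = [
    Channel(input: "Bias FX", output: "Loopy"),   // guitar rig feeds the looper
    Channel(input: "Loopy",   output: "AUM")      // looper output carries on to AUM
]

// AUv3 sends, effects and mixing then happen inside AUM; saving the AB3
// session recalls both the Loopy and AUM states.
for channel in ab3Channels {
    print("\(channel.input) -> \(channel.output)")
}
```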
This routing of audio to Loopy in Audiobus and then using Loopy as an audio sender into AUM made me curious. It had never occurred to me to use an app as both an audio receiver and an audio sender in Audiobus:
It works.
What I can't get my head around is: why would you do this?
What role does Loopy play in this setup?
If you're sending the audio to AUM and using effects there, surely you would also record in AUM.
Or not?
ymmd
Yeah, you could just add those effects in Audiobus, I would think.
AUM has easier routing, plus easier MIDI setup, if you needed those.
The OP was describing a ‘live’ setup. In that case, I would use the fewest apps possible, so no AUM; just add FX in Audiobus.
Or maybe add apeMatrix, too, and get really tangled up in Virtual Spaghetti.
There is a pink elephant in the room no one is talking about:
latency compensation.
AUM and AB don't do this.
How do you check that?
Easy: take an instrument and drop some FX on it, and the same without FX ...
I noticed it when I had two different drum things running at the same time, one with and one without an FX chain ...
I think it went down in the Audiobus bug list as the "this doesn't make me want to shake my booty" bug.
AUM does do latency compensation.
oh, it does?

Does it work?
My main tool is AB,
so I compensate with:
a) using the same FX everywhere, or
b) no FX at all, doing everything in the instruments.
I'm not sure 🤔 how sample-accurate it is, but AUM does do latency compensation.
In scenario A, are you adding the same FX to all tracks but only processing on the tracks that require it, with the other tracks having the FX in a pass-through setup just to add matching latency?
🤦🏻‍♀️
That's an interesting question.
I'm not sure how bypass is actually handled in AB,
so most of the time I just add the same AU on everything.
If it has a dry/wet control, just keep it at 1% wet when FX is unwanted on a track, as bypass may be handled differently in different FX.
Yes, that's more what I meant: the FX is active, but with minimal to zero actual processing, 100% dry. Just a pass-through to add latency.
Pretty genius solution, @Max23. I dig it.
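To make the trick concrete: putting the same plugin on every track adds the same delay everywhere, which amounts to delaying the un-processed tracks yourself by the plugin's latency. A minimal sketch of that idea, assuming a made-up 5 ms of plugin latency at 44.1 kHz (not values reported by any particular plugin):

```swift
import Foundation

// The dry track is padded with the same number of samples of silence that the
// FX adds to the processed track, so both stay aligned. Illustrative values only.
let sampleRate = 44_100.0
let pluginLatencySeconds = 0.005                                       // assumed 5 ms look-ahead
let delaySamples = Int((pluginLatencySeconds * sampleRate).rounded())  // about 221 samples

/// Pad the start of a dry mono buffer with silence so it arrives as late as
/// the processed track (length stays the same; the tail is trimmed).
func alignDryTrack(_ samples: [Float], by delaySamples: Int) -> [Float] {
    return [Float](repeating: 0, count: delaySamples) + samples.dropLast(delaySamples)
}

let dryTrack = [Float](repeating: 1.0, count: 4_410)   // 100 ms of dummy audio
let aligned = alignDryTrack(dryTrack, by: delaySamples)
print("Delayed the dry track by \(delaySamples) samples; length is still \(aligned.count)")
```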
AUM does latency compensation, both for latency added by effects, and for hardware round-trip latency when recording live audio.
Of course, the effect must report its latency to the host; otherwise the host can't compensate for it. For AUv3 there's an API for that. For IAA there isn't, but I invented my own: http://lijon.github.io/iaa_latency_comp.html
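For reference, the AUv3 API mentioned here is presumably the `latency` property on AUAudioUnit: the plugin overrides it to report its processing delay in seconds, and the host reads it back to compensate. A minimal sketch, with the 256-frame look-ahead and the fixed sample rate made up purely for illustration:

```swift
import Foundation
import AudioToolbox

// Plugin side: an AUv3 effect reports its processing delay by overriding the
// read-only `latency` property (in seconds) on AUAudioUnit.
class LookaheadEffectAU: AUAudioUnit {
    private let lookaheadFrames: Double = 256   // assumed internal look-ahead buffer

    // A real plugin would derive this from its current render format rather
    // than hard-coding a 44.1 kHz sample rate.
    override var latency: TimeInterval {
        return lookaheadFrames / 44_100.0
    }
}

// Host side: read the reported latency and turn it into a per-track
// compensation amount in samples.
func compensationFrames(for unit: AUAudioUnit, sampleRate: Double) -> Int {
    return Int((unit.latency * sampleRate).rounded())
}
```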
It's a big can of worms!
How do I know whether that API is in a given AU if I haven't tested it or asked?
How do I know whether some IAA app implemented your workaround if I haven't tested it?
You see what I mean; it's a big black box for users, who only find out the hard way.
Some AUs now advertise a zero-latency mode, but that's usually some other algorithm to save CPU time, not the whole enchilada ...
yes, it can be tricky...
Recently I recorded a guitar connected to an iCA4+ on two iPads (each running a different version of JamUp) simultaneously in AUM with hardware latency compensation on.
The tracks were 3 ms off (roughly 130 samples at 44.1 kHz); maybe one JamUp was set to low latency and the other to normal, or possibly different iOS versions (I don't remember).
As @j_liljedahl mentioned: if something isn't reported, it can't be handled.
In this very simple example you can at least check with reasonable effort, but things can grow quickly in complexity.
If I had AUM, could I use the staff editor in Notion and route its MIDI to synths in other apps? It vexes me that neither Auria Pro nor Cubasis features even a basic staff editor. I find putting notes on a grid to be overly cumbersome. If AUM enabled me to do just this one thing, it'd be worth the price to me.