audio routing black magic [linking audio between all kinds of different hosts]
I am still looking for a workflow I am most comfortable with. Every iPad app seems to have its own niche strength. Xequence 2, Cubasis, AUM, Loopy Pro and Drambo are all apps I gravitate to for organizing my music. There are some things keeping me from going full Loopy (no MIDI, no automation yet) or full Drambo (no continuous timeline) standalone yet. And when I use them as AUs, you cannot use other AUs inside them.
Today I found that @Jorge’s Visual Swift transmitter and receiver AUs allow me to use the Drambo and Loopy standalone apps as audio multi-out apps, which I’m really enthusiastic about. I haven’t really tested it and you might hear some crackling here and there. Maybe it’s already quite well known, but I thought I’d share it here with you girls and guys in the hope of contributing a little.
If you don't need some particular function of AUM, you might be able to reduce some crackles by using just Loopy and Drambo without AUM.
And then ‘guiding’ everything to Visual Swift?
Is there some reason not to use Loopy Pro or Drambo as the central mixer? What is AUM bringing to the Loopy Pro + Drambo party?
I’m missing a continuous MIDI/automation timeline, and I’m experimenting with different combinations to see where I can get. It’s not that this is a well-thought-out solution. I was kind of surprised that this was possible at all.
I was experiencing some obstacles driving Drambo from Xequence with the same ease as @wim’s (if I am correct) Xequence/GR-16 combo.
Just trying things out..
edit: I guess I also want to be able to mix all the different tracks together while making the song. I like to apply mixbus/master effects from very early on.
In case you don't know, Loopy has a mixbus and a similar level of automation possibilities to AUM. I only mention it because if you can accomplish the same capabilities with Loopy or Drambo as the mixing host, you will get more out of the CPU and have more memory available. AUM is great, but you may get better performance without it.
If you detail these, I may be able to help iron out the problems with that approach if you like.
I maybe should mention that right now Loopy Pro has a little Ableton Link glitch which hopefully Michael will get straightened out soon.
One big advantage for me of using Loopy Pro as the mixing host is the ability to "idle" plugins when not in use. As someone with older hardware, this is a huge benefit for being able to use a lot more than my iPad could otherwise handle.
If using Drambo and Loopy Pro is the goal, my approach would be to use Loopy as the host with Drambo AU inside it. In that configuration I don't think I'd miss the ability to use AUs in Drambo since they can be hosted in Loopy but still be driven from Drambo via MIDI out.
This could easily go the other way (Loopy AU in Drambo), but for me, the smoother workflow of committing to audio to conserve processing power, while still being able to back out and tweak things, makes Loopy the more sensible host. YMMV.
I've not done any experimentation using X2 for sequencing and timeline duties with Loopy and Drambo. Perhaps I should...
That’s a good idea! Hadn’t thought of it that way. Ideally, what I’m after is automation ‘straight from the AU knobs’ as in Cubasis or Drambo. No assigning stuff in between..
AUM and Loopy are both the same in that regard -- neither supports that sort of automation recording directly.
Thank you for the offer! I will see if I can describe what I’m running up against later this week. I still need to wrap my head around what it is I’m running up against.
edit: turned out I had forgotten an essential thing about multi-track recording in Xequence..
LP hosting Drambo is the workflow I’m currently trying to iron out. Loopy is such a liberating place to make music that I’m completely invested in it, but the power of the new Drambo has got to be in there too, at the very least as a MIDI sequencer.
Can’t decide whether it should be called Droopy or Lambo 🤔
Using Xequence to drive Drambo allows for quick automation assignment: map the parameter from the AU in Drambo by tweaking it. You now have a knob for the parameter in the rack. MIDI-learn that knob with a Xequence CC. You’re there.
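For anyone curious what that MIDI-learn step is doing conceptually: while "learning", the first CC that arrives gets bound to the parameter, and from then on CCs with that number drive it. A minimal sketch in Swift (the names `MidiLearn`, `handle`, `Parameter` are illustrative, not any app's actual API):

```swift
// Conceptual sketch of MIDI-learn: wiggle a controller knob while in learn
// mode, and the incoming CC number gets bound to the target parameter.
struct Parameter {
    var value: Float = 0 // normalized 0…1
}

final class MidiLearn {
    private var learnedCC: UInt8?
    var parameter = Parameter()

    func handle(cc number: UInt8, value: UInt8, learning: Bool) {
        // In learn mode, capture whichever CC arrives first.
        if learning { learnedCC = number }
        // Only the learned CC drives the parameter; others are ignored.
        guard number == learnedCC else { return }
        parameter.value = Float(value) / 127.0 // 7-bit MIDI → 0…1
    }
}
```

This is why tweaking the Drambo rack knob once (to map it) and then wiggling the controller once (to learn it) is all the wiring you need: each step just records a binding.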
DROOPY WILL RULE THE WORLD!
Does loopy import channel saves of midi mappings. Do I actually need to map most synths and they be there, on controller. When synth/channel is loaded. Like I wont tweak the screen again and do I discard synths, that arnt just knobs but wierd buttons etc ( thats the question )
If I need everything to be mapped and start tweaking synths generally via controller. I think it will need to be Drambo on ipad with drums and samples. Then aum on other ipad with loopy au. That means no automation etc for synths for the sake of controllers. Or maybe I wont map many parameters anyway but I need to make decision as all I do. Is map systems.
Since Loopy Pro doesn’t support multi-out I guess it makes more sense to host Drambo in Loopy.
In my case I’d use Loopy to record guitars, bass, vocals, etc… those tracks definitely need their own processing and benefit from AUv3 plugins.
Drambo could do drums, synths and plain MIDI to drive other synths hosted in Loopy. You could solo the drums in Drambo and record that to a Loopy clip for further AUv3 plugins… You’d need to map some “fake” clips to trigger Drambo scenes/patterns.
Seems like a good amount of setting up, but powerful enough to make it a starting template.
What kind of black magic are you using to send an audio signal between two hosts, @prtr_jan ?
I'm trying VisualSwift, and I'm only able to send the audio within the same host.
Please, can you explain what's the trick?
A detailed setup guide, if you have the time, will be greatly appreciated.
Haha, it did feel like magic indeed.
Are you aware of the separate transmitter and receiver audio units?
Put a TX AU at the end of the audio chain you want to send.
Put an RX AU wherever you want the audio from the TX to come out, in the ‘receiving’ DAW for instance.
Press a letter of the alphabet in the TX AU you want to send from (you can set up different audio links using the different letters).
Hope that helps as a starter.
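My guess at what a TX/RX pair like this does under the hood: each channel letter keys a shared ring buffer that the transmitter writes into and the receiver reads from. This is an assumption about how VisualSwift works, not its actual implementation; a simplified single-process sketch in Swift (`AudioLink`, `send`, `receive` are made-up names):

```swift
import Foundation

// Hypothetical sketch: a shared ring buffer per channel letter (A–Z).
// The TX AU pushes rendered samples in; the RX AU pulls them out.
final class AudioLink {
    private static var channels: [Character: AudioLink] = [:]
    private var buffer: [Float]
    private var writeIndex = 0
    private var readIndex = 0

    private init(capacity: Int = 4096) {
        buffer = [Float](repeating: 0, count: capacity)
    }

    // Both sides look up the same link by its letter.
    static func channel(_ letter: Character) -> AudioLink {
        if let link = channels[letter] { return link }
        let link = AudioLink()
        channels[letter] = link
        return link
    }

    // TX side: copy the rendered frames into the ring buffer.
    func send(_ frames: [Float]) {
        for sample in frames {
            buffer[writeIndex % buffer.count] = sample
            writeIndex += 1
        }
    }

    // RX side: pull frames out; output silence on underrun.
    func receive(_ count: Int) -> [Float] {
        var out = [Float]()
        for _ in 0..<count {
            if readIndex < writeIndex {
                out.append(buffer[readIndex % buffer.count])
                readIndex += 1
            } else {
                out.append(0) // underrun: a plausible source of crackles
            }
        }
        return out
    }
}
```

In real audio units the send/receive calls would live in the render blocks, and bridging two separate host apps needs a lock-free shared-memory buffer; under- and over-runs on that boundary would explain the crackling mentioned earlier in the thread.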
Subscribed to thread…
@Pynchon apart from my answer above: in the video at the start of this thread you can see two TX AUs in two Loopy tracks (a vocal and a drum loop sample): those are transmitters A and B (you can see the letters in the respective AUs; you need to press the letter buttons). In Drambo you can see the C TX AU (transmitting a Grooverider kick).
In AUM you see three RX AUs with corresponding letters receiving the audio from within Loopy and Drambo.
You can also host Loopy in Drambo for that, I guess. In the example below I have three Loopy AUs in one Drambo track. The same project is loaded in all three Loopy AUs, but in each instance two different colour groups are muted (there are three loops playing: a drum loop, a synth pad and a vocal). Each Loopy gets one bus in the layer mixer. The synth-pad bus has a Koala FX on it with a filter effect (mapped parameter, MIDI-learned and driven by Xequence). The individual Loopy samples are also played from within Xequence after MIDI-learn in Loopy.
I really want Xequence’s timeline and editing flexibility.
edit: hmmm, hadn’t thought that through. You need your audio in for your guitars, bass, vocals, etc… I actually don’t know if you can use Loopy as an effect (to get the audio in) in the same situation as in the video..
So far I've had limited, crackly success sending from Loopy to AUM. Found that the receiving only works while I'm watching Loopy; if I open AUM (or even come out of Loopy to open AUM) my transmitted signal changes to a pulse wave, ugh.
As soon as I go back to Loopy, the transmitted signal returns to normal.
Interestingly, in the Loopy->AUM config, VisualSwift channel A wouldn't work for me at all.
Just tried sending from AUM to Loopy: problems all solved. Channel A working fine. Sending and receiving as expected.
2018 iPad Pro 11”, iOS 15.5
Great thread btw!
I’m on a 4th gen Air (2020) iOS 15.1.
If I only use one ‘Swift audio link’ I have no crackling problems.
Single note on vhs synth
Seems OK, although when I ‘come out’ of Loopy (i.e. swipe up at the base of the screen) the final croak of the audio is a bit gnarly.
Do you have background audio on in Loopy? That is probably why it is only working for you when Loopy is in the foreground.
System settings -> Play in background -> toggled off: leads to a single-tone pulse wave; it can still be channelled through the effects chain etc. but isn't hugely useful.
System settings -> Play in background -> toggled on: leads to correct transmission into AUM.
Nice one, thanks. Already a couple of beers in, no chance of thinking about this clearly now
Thanks for the comment. Just tried Loopy inside Drambo; you need to use the Processor audio unit to get audio into Loopy. The Generator audio unit doesn’t seem to be able to record audio. I like your idea of using multiple Loopy instances; I don’t know whether it’ll be heavy on the CPU or how well it’ll go.
Using MIDI learn in Loopy you can map clips to Drambo notes, which I guess is the way to go…
What I can’t seem to do is map Loopy parameters to Drambo; using the Drambo “map” doesn’t seem to do anything.
Anyone using loopy in drambo2 that can comment on how they set it up?
Assign the parameters to widgets and then MIDI-learn those widgets?
Many thanks, @prtr_jan !
My mistake was that I was putting the receiver in the instrument slot of AUM, instead of using the FX slot.
Putting the transmitter in the FX chain of a host (Drambo in this case), and the receiver also in an FX slot (AUM), it works flawlessly.
Most importantly, this can be used to transmit audio between Studiomux and AUM.
In the latest version of Studiomux, the option of processing effects from your computer was generating distorted audio.
Now I can send an audio track from Bitwig in my Mac to Studiomux in the iPad. Send the audio to AUM via VisualSwift, applying effects.
Then I connect the AUM audio signal to Studiomux, using a second channel. And then from Studiomux to Bitwig.
You have solved a problem that I had been trying to solve for months!
Pure audio routing black magic.