While we're all cooped up inside, here're some Loopy Pro updates


Comments

  • Thanks for the input everyone else - I’m pretty sure one can make a compelling argument on either side, to be honest. If in doubt, sticking with the widely-accepted “release early and often” adage is probably the way to go.

  • wimwim
    edited March 2020

    For me, just knowing that MIDI looping is also a priority is good enough. I’m patient. B)

  • @Michael said:

    @McD said:
    Both have their own clocks so the MIDI sent into AUM can't be synced to launch audio tracks with any accuracy.

    Actually, I don’t think that’s the case, at least not as I understood it.

    I am hardly a reliable source. But there's some heat around "song position" with accuracy.

    Jonatan knows what he’s talking about so maybe something’s been lost in translation.

    Yes. Pulling him into this was not a good idea because he looks for mentions and this is noise for him. I suspect he's looking at adding a MIDI looper, or syncing his file players with controls for entering at any point in the wave file... start/stop, etc.

    If you can just sync audio with Xequence you'd have a winner.

    I’ll get in touch with @SevenSystems to talk sync with AB, I’m sure there’s something that can be done.

    Put that on the 1.1 roadmap and ship this according to your current design... beta, fix, and ship. Keep the testing team small and quiet. No more Drambo-style premature-ejaculation porn needed.

    Go dark and code. Unless you want the Buzz... in which case I have boxes of bees handy.
    Quarantined with Boxes of Bees handy?

  • @wim said:
    I’m patient. B)

    You forgot the article in the middle of the sentence: "a" B)

  • @Michael is the pitch shift effect really playable at low latency?

  • edited March 2020

    @Janosax said:
    @Michael has pitch shift effect really playable low latency?

    Pitch shift is pretty much impossible to do at very low latency, since it needs to be done in large-ish blocks for the frequency transform. Right now I measured the playthrough at about 60-90ms depending on the magnitude of the pitch shift; that's at 128 frames on my iPad Air 2, which gives about 10ms of playthrough latency with no effects.

    That's just my first cut, though, I might be able to do better in certain cases.

    Edit: Just did a little looking around; turns out I'm not 100% correct, there is the possibility of doing it in the time domain at low latency. Doesn't really sound as good as the system I'm using, though (guitar only).
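
To put some rough numbers on the block-size argument above, here's a back-of-the-envelope sketch. The FFT size and sample rate are illustrative assumptions, not measurements of Loopy's actual implementation.

```python
SAMPLE_RATE = 44100  # Hz, a common iOS session rate

def fft_frame_latency_ms(fft_size: int) -> float:
    """A frequency-domain (phase-vocoder style) pitch shifter must buffer
    a full analysis window before it can emit any output, so it adds at
    least one FFT frame of latency (simplified model)."""
    return fft_size / SAMPLE_RATE * 1000.0

def buffer_latency_ms(frames: int) -> float:
    """One-way hardware buffer latency for a given buffer size."""
    return frames / SAMPLE_RATE * 1000.0

# A 4096-point analysis window alone costs ~93 ms -- the same order as
# the 60-90 ms playthrough figure quoted above -- while a 128-frame I/O
# buffer contributes only ~3 ms per direction.
print(round(fft_frame_latency_ms(4096)))  # 93
print(round(buffer_latency_ms(128), 1))   # 2.9
```

Which is why shrinking the hardware buffer can't help here: the analysis window, not the I/O buffer, dominates.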

  • This! looks amazing!
    Congrats on parenthood!
    This will be an instabuy for me.

  • @MrSmileZ said:
    This! looks amazing!
    Congrats on parenthood!
    This will be an instabuy for me.

    Cheers!

  • @Michael said:

    @Janosax said:
    @Michael has pitch shift effect really playable low latency?

    Pitch shift is pretty much impossible to do in very low latency, because it needs to be done in large-ish blocks because of the frequency transform. Right now I measured the playthrough at about 60-90ms depending on the magnitude of the pitch shift, that's at 128 frames on my iPad Air 2, which gives about 10ms of playthough latency with no effects.

    That's just my first cut, though, I might be able to do better in certain cases.

    Edit: Just did a little looking around, turns out I'm not 100% correct, there is the possibility of doing it in the time domain at low latency. Doesn't really sound as good, vs the system I'm using (guitar only)

    Yes, there are much lower-latency AU pitch shifters that are really playable, like VoiceRack FX from TC Helicon, but it’s abandoned and has some bugs.

  • @Janosax said:

    @Michael said:

    @Janosax said:
    @Michael has pitch shift effect really playable low latency?

    Pitch shift is pretty much impossible to do in very low latency, because it needs to be done in large-ish blocks because of the frequency transform. Right now I measured the playthrough at about 60-90ms depending on the magnitude of the pitch shift, that's at 128 frames on my iPad Air 2, which gives about 10ms of playthough latency with no effects.

    That's just my first cut, though, I might be able to do better in certain cases.

    Edit: Just did a little looking around, turns out I'm not 100% correct, there is the possibility of doing it in the time domain at low latency. Doesn't really sound as good, vs the system I'm using (guitar only)

    Yes there are much lower latency AU pitch shift really playable like VoiceRack FX from TC Helicon, but it’s abandoned and has some bugs.

    Oh yeah? Did it sound any good? What kind of latency are we talking about?

    I'm definitely interested in improving things if it's possible, but I don't wanna go too far down the rabbit hole without knowing there'll be something worthy at the end =)

  • hey Michael, do you have a guesstimate of when you’ll be able to release this, or are you holding out for now until you get closer?

  • @reasOne said:
    hey micheal do you have a guesstimate when you’ll be able to release this or are you holding out for now until you get closer

    No, I'm keeping that largely to myself for now to give me some flexibility... and avoid embarrassment =)

  • @Michael said:

    @reasOne said:
    hey micheal do you have a guesstimate when you’ll be able to release this or are you holding out for now until you get closer

    No, I'm keeping that largely to myself for now to give me some flexibility..and avoid embarrassment =)

    haha don’t want another drambo thread ? 🤣

  • @reasOne said:

    @Michael said:

    @reasOne said:
    hey micheal do you have a guesstimate when you’ll be able to release this or are you holding out for now until you get closer

    No, I'm keeping that largely to myself for now to give me some flexibility..and avoid embarrassment =)

    haha don’t want another drambo thread ? 🤣

    Haha, actually, despite @McD's disparagement, I could definitely use the buzz! But I'll build up to it.

  • edited March 2020

    @Michael said:

    @Janosax said:

    @Michael said:

    @Janosax said:
    @Michael has pitch shift effect really playable low latency?

    Pitch shift is pretty much impossible to do in very low latency, because it needs to be done in large-ish blocks because of the frequency transform. Right now I measured the playthrough at about 60-90ms depending on the magnitude of the pitch shift, that's at 128 frames on my iPad Air 2, which gives about 10ms of playthough latency with no effects.

    That's just my first cut, though, I might be able to do better in certain cases.

    Edit: Just did a little looking around, turns out I'm not 100% correct, there is the possibility of doing it in the time domain at low latency. Doesn't really sound as good, vs the system I'm using (guitar only)

    Yes there are much lower latency AU pitch shift really playable like VoiceRack FX from TC Helicon, but it’s abandoned and has some bugs.

    Oh yeah? Did it sound any good? What kind of latency we talking about?

    I'm definitely interested in improving things if it's possible, but I don't wanna go too far down the rabbit hole without knowing there'll be something worthy at the end =)

    I haven’t measured it, but round-trip latency at 128-frame buffers on a 7 Plus, plus this VRFX plugin's own latency, must give a total somewhere between 15 and 20 ms, or I couldn’t play live with it. This is with their +/-1 octave pitch shift algos.
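
That estimate can be sanity-checked by tallying round-trip latency as input buffer + output buffer + the plugin's own processing delay. The 512-frame plugin figure below is a guess chosen to land in the quoted range, not a measured value for VoiceRack FX.

```python
def total_latency_ms(buffer_frames: int, plugin_frames: int,
                     sample_rate: float = 44100.0) -> float:
    # Round trip = one input buffer + one output buffer, plus whatever
    # extra latency the plugin itself introduces (figures hypothetical).
    return (2 * buffer_frames + plugin_frames) * 1000.0 / sample_rate

# 128-frame buffers plus a ~512-frame plugin delay lands around 17 ms,
# consistent with the 15-20 ms estimate above.
print(round(total_latency_ms(128, 512), 1))  # 17.4
```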

  • @Michael said:

    @reasOne said:

    @Michael said:

    @reasOne said:
    hey micheal do you have a guesstimate when you’ll be able to release this or are you holding out for now until you get closer

    No, I'm keeping that largely to myself for now to give me some flexibility..and avoid embarrassment =)

    haha don’t want another drambo thread ? 🤣

    Haha, actually, despite @McD's disparagement, I could definitely use the buzz! But I'll build up to it.

    hehe def agree! once people start seeing more of it, either videos or screenshots, it’ll build up a lot of hype! if you need a small elite group of beta testers tho, i have nothing but time right now! haha

  • @reasOne said:

    @Michael said:

    @reasOne said:

    @Michael said:

    @reasOne said:
    hey micheal do you have a guesstimate when you’ll be able to release this or are you holding out for now until you get closer

    No, I'm keeping that largely to myself for now to give me some flexibility..and avoid embarrassment =)

    haha don’t want another drambo thread ? 🤣

    Haha, actually, despite @McD's disparagement, I could definitely use the buzz! But I'll build up to it.

    hehe def agree! once people start seeing more of it , either videos or ss it’ll build up a lot of hype! if you need a small elite group of beta testers tho i have nothing but time right now! haha

    Hehe, thank you!

  • Scuse me for butting in but why does time-stretching have to be in real-time?

    If this is a looper, then the result of the time-stretching algo could be fed into the loop shifted forwards in time, and only when playback comes back around to the start of the loop does it play the shifted version?

    I fully understand that I could be missing something though..
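
The deferred-stretch idea described above could be sketched like this (a minimal sketch; the class and function names are hypothetical, and a real implementation would run the stretch on a background thread while the audio thread keeps playing):

```python
def stretch(samples: list, factor: float) -> list:
    # Naive nearest-neighbour time stretch -- a placeholder for a real
    # (and much slower) high-quality algorithm.
    return [samples[int(i / factor)] for i in range(int(len(samples) * factor))]

class Loop:
    def __init__(self, samples: list):
        self.samples = samples
        self._pending = None

    def request_stretch(self, factor: float) -> None:
        # Compute the stretched version ahead of time; playback of the
        # current samples is unaffected until the loop wraps.
        self._pending = stretch(self.samples, factor)

    def on_loop_restart(self) -> None:
        # Called when playback wraps around to the start of the loop:
        # only at this boundary does the stretched version become audible.
        if self._pending is not None:
            self.samples, self._pending = self._pending, None
```

So the expensive work happens between loop boundaries, and the audible swap itself is a cheap pointer exchange.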

  • @Jocphone said:
    Scuse me for butting in but why does time-stretching have to be in real-time?

    If this is a looper then the result of the time-stretching algo could be fed into the loop shifted forwards in time and only when it comes to play back from the start of the loop, does it play the shifted version?

    I fully understand that I could be missing something though..

    Yeah, that too. The live thing is important for performing though.

  • edited March 2020

    As an example, the pitch shift effect in N.I. Replika XT works at almost zero (or at least not noticeable) latency. It makes a giant difference for live playing indeed; things just glue perfectly.
    But the CPU use is intense as well.

  • edited March 2020

    @Michael said:
    And here's Loopy hosting an AU as part of an effect preset (in a chain with an EQ). The parameters of the AU can be bound to X or Y on the effect pads on the main screen (as many parameters as you like bound to XY), and controlled via a MIDI controller too.

    Loopy will automatically load and manage multiple instances of the AU so it can apply it via multiple sends and inserts, as needed.

    Glad about send to AUfx option, I might just about be able to recreate my current Audiobus setup. Shhhhh don’t tell the developer.

    @Michael said:
    Yeah, you guessed wrong – it’s a colossal job!

    I’d prefer the two (audio and MIDI apps) to be separate. The last thing you want is an app full of features you don’t use, not to mention a more complex beast to maintain.
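
The XY binding described in the quote above amounts to mapping a normalized pad position onto each bound parameter's range. A hypothetical sketch (the binding format is an assumption, not Loopy's actual data model):

```python
def xy_to_params(x: float, y: float, bindings: list) -> list:
    """Map a normalized pad position (x, y in 0..1) onto bound parameters.

    `bindings` is a list of (axis, lo, hi) tuples -- one entry per AU
    parameter bound to the pad, as many as you like per axis.
    """
    values = []
    for axis, lo, hi in bindings:
        t = x if axis == "x" else y
        values.append(lo + t * (hi - lo))
    return values

# Bind X to a filter cutoff (20-20000 Hz) and Y to a mix amount (0-1):
print(xy_to_params(0.5, 0.25, [("x", 20.0, 20000.0), ("y", 0.0, 1.0)]))
# [10010.0, 0.25]
```

The same mapping works whether the position comes from a finger on the pad or from a MIDI controller.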

  • edited March 2020

    The main feature MIDI looping provides is quantization, which is really needed if you want to loop tight drums and basses with AUs or external gear. Having MIDI looping built into Loopy Pro would save a lot of resources, which means lower glitch-free latency, needed if you want audio monitoring with AU effects. No other iOS looper has this feature. This is why I use Ableton now. With a great built-in MIDI looper, low-latency pitch shift with formant control, group looping, and solid preset MIDI management/loading, I could go back to iOS easily. I know @Michael is a great dev with nice ideas; he knows how to debug and listen to users, so there is a chance that Loopy Pro will be the looper I hope for ;-)

    Edit: with a low-latency pitch shift effect, you will also have the basis for a harmonizer. Then Loopy Pro will rule them all!! @Michael
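
The quantization being asked for here is conceptually simple: snap each recorded note-on to the nearest grid line. A minimal sketch (the event format and tick resolution are assumptions for illustration, not Loopy's internals):

```python
def quantize(events: list, grid_ticks: int, strength: float = 1.0) -> list:
    """Snap MIDI note-on times to a grid.

    `events` is a list of (time_in_ticks, note) pairs. A `strength` of
    1.0 snaps fully; lower values move notes only part-way toward the
    grid, preserving some of the original feel.
    """
    quantized = []
    for t, note in events:
        nearest = round(t / grid_ticks) * grid_ticks
        quantized.append((round(t + (nearest - t) * strength), note))
    return quantized

# A sloppy kick/snare pattern pulled onto a 120-tick grid:
print(quantize([(10, 36), (115, 38), (250, 36)], 120))
# [(0, 36), (120, 38), (240, 36)]
```

The `strength` parameter is what makes this usable live: full snapping for drums and bass, partial snapping for parts where a perfectly rigid grid would sound mechanical.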

  • some days you want midi
    some days you want audio
    i don’t see why having them all in one app is going to make it more complex; i feel like it would simplify the work process myself. i don’t work with audio much until i’m mixing stems or mixing down midi. recording audio loops is fun for live jamming, but i’m not good enough to create real songs with my results.

  • @Janosax said:
    The main feature midi looping provides is quantization

    I would say the main feature of MIDI looping is the flexibility to change notes and sounds. The biggest "problem" with audio looping is the locked-in nature of audio recordings. Repetition can be a lovely thing, but at a certain point you probably want to make a change. With audio, apart from superficial changes with effects processing, you have to record a new section. But unless you're able to play multiple instruments at once, getting from your A section to your B section has to be done in steps, recording one instrument at a time.

    With MIDI you could e.g. transpose the notes, filter out every other note, change the sound of the destination synth, etc. And all of this could be set up and triggered automatically, or with a footswitch, etc.
    So, for example, if you record your bass and pads as MIDI and have them routed through a transposing scale quantizer, you could modulate both of those while still playing your lead instrument, allowing you to make a big change from your A section to your B section all at once.
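
The first two transformations mentioned above are one-liners once the loop is stored as data rather than audio. A hypothetical sketch, operating on bare note numbers (timing and velocity omitted for brevity):

```python
def transpose(notes: list, semitones: int) -> list:
    # Shift every recorded note, clamped to the valid MIDI range 0-127.
    return [max(0, min(127, n + semitones)) for n in notes]

def thin(notes: list) -> list:
    # "Filter out every other note" from the loop.
    return notes[::2]

# Drop a C major arpeggio down an octave, then thin it out:
print(transpose([60, 64, 67, 72], -12))  # [48, 52, 55, 60]
print(thin([60, 64, 67, 72]))            # [60, 67]
```

None of this is possible with an audio loop, which is exactly the flexibility argument being made.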

  • edited March 2020

    @aplourde said:

    @Janosax said:
    The main feature midi looping provides is quantization

    I would say the main feature of MIDI looping is that the flexibility to change notes and sounds. The biggest "problem" with audio looping is the locked-in nature of audio recordings. Repetition can be a lovely thing, but at a certain point you probably want to make a change. With audio, apart from superficial changes with effects processing, you have to record a new section. But unless you're able to play multiple instruments at once it means getting from your A section to your B section has to be done in steps, recording one instrument at a time.

    With MIDI you could e.g. transpose the notes; filter out every other note; change the sound of the destination synth; etc. And all of this could be setup and triggered automatically, or with a footswitch, etc..
    So, for example, if you record your bass and pads as MIDI and have them routed through a transposing scale quantizer, you could modulate both of those while still playing your lead instrument allowing you to make a big change from your A section to your B section all at once.

    That’s also true, but that's for producing; the first purpose of a looper is live recording. For live looping you will prefer audio for stability and resource management. But even an acoustic instrument player will eventually need to loop some drums or other instruments to build a song live. It’s super hard to loop drums and groovy bass tightly without quantization, so it’s a huge help here. Moving notes on a grid is hardly doable on stage while playing and looping. For production, I usually use Atom for MIDI and then record the resulting audio straight into GTL. But it’s not manageable live, especially with loop grouping.

    Edit: I agree all the other ideas you mentioned are great ones, especially transposition. Also, AudioLayer is the best AU for that kind of stuff IMO: it’s resource-friendly, has built-in effects, preset changes through the host are super fast, and it streams from disk. Using Rozeta live in a looper could be super interesting too, with its follow actions and randomization.

  • Highly likely we will see the Eventide H910 on iOS. The Loopy of harmonisers!

    Ableton + iOS is an amazing combination, Link 2 and Project Export means no cables or hassle.

    I'm confused about the either/or approach that people have about this.

  • edited March 2020

    @BlueGreenSpiral said:
    Highly likely we will see the Eventide H910 on iOS. The Loopy of harmonisers!

    Ableton + iOS is an amazing combination, Link 2 and Project Export means no cables or hassle.

    Confused about the either/or approach that people have about this.

    It’s all about using a looper for production vs using it live in real time. Two totally different things, especially on stage, where you can’t take the time for exports and other production processes.

  • edited March 2020

    Still confused 🤷‍♂️

    Genuinely

    Ableton live is good at the live thing.

    So is Loopy.

  • edited March 2020

    @BlueGreenSpiral said:
    Still confused 🤷‍♂️

    Genuinely

    Ableton live is good at the live thing.

    So is Loopy.

    Imagine you want to loop a first drum part with tight timing and you’re not a good enough player for that. You will need to quantize it. Same for most bass parts if you’re not a bassist, and synth/piano parts if you’re not a keyboard player. Imagine also you want some variety in your song structure; then you will need loop grouping like GTL or Live. You can use Atom or Gadget for MIDI looping, but you will not be able to group anything in sync with the audio looper.

    If I want to use GTL or Loopy and create all my loops live, I will certainly use something like Noir for drums and loop very basic basses, or my whole song with the saxophone and no quantization on the rhythm foundations will feel floaty rhythmically. This is why I use Ableton right now. Or GTL iOS with my own pre-produced loops, since in that setup I only loop sax live. But then I’d miss the cool « built from start » song thing. So Loopy Pro can be an answer to this.

    Having something light and mobile like an iPad is interesting on stage, especially because while performing it’s better to have a backup: two iPads (or iPhones) are much easier to travel with than two laptops.

  • Just thinking about improvising a liveset from scratch in front of an audience is giving me panic 🤣

This discussion has been closed.