Use of AU vs IAA (standalone)?

I ask for this reason: I watched Jakob's video, and something occurred to me.

I am stuck in my ways, yes.

But there's something else.

AU has changed my use of iOS for music making. In some ways, I use it less.

I use the iPad as an instrument, so the AU format is sort of annoying sometimes. I don't like resizing AUs and the other things that come with it.

I do use AUs for FX, yes. That's cool.

But for synths and other instrument-type apps, I like full-size apps.

Am I alone?

Do you desire to have IAA/FULL SIZE option for your apps? (42 votes)
  1. Yes: 52.38%
  2. No: 47.62%

Comments

  • I’ve answered ‘no’, but what I really mean is that I like AUs that are full size. I never use IAA unless there’s literally no other option. It’s too unwieldy, it doesn’t state save, and it gets in the way of doing things because too much time is spent messing about trying to get everything working as it should do.

  • @audio_DT said:
    I’ve answered ‘no’, but what I really mean is that I like AUs that are full size. I never use IAA unless there’s literally no other option. It’s too unwieldy, it doesn’t state save, and it gets in the way of doing things because too much time is spent messing about trying to get everything working as it should do.

    I use some iOS devices with a single app as an instrument, or as an audio signal destination or source, so I guess my use case is unusual for some.

    Thanks for your feedback!

  • edited September 2019

    IAA is deprecated by Apple.
    AU is the future.

  • @MobileMusic said:
    IAA is deprecated by Apple.
    AU is the future.

    That is original...

    Yes, that is the point of this thread: thoughts on it with regard to individual usage.

    But thanks for the breaking news... ;)

  • edited September 2019

    If you use your iPad as an instrument, you don't need either IAA or AU... unless I'm totally misunderstanding you 😉

    (edit: you probably mean AU apps that do not have the ability to be used standalone at all (i.e. don't come with a container app that essentially just hosts the AU full-screen and uses the default audio input / output). In that case, I'd agree that every AU app should offer that 🙂)

  • edited September 2019

    I love AU performance/stability but really hate floating windows and resizing on small screens...so IAA when possible.
    Hopefully @Michael will allow 'true' full screen for some AUs (with the help of the classic AB sidebar for app switching)

  • +1. On the iPhone, apps like iFretless or GeoShred are much better in standalone mode. An AU full-screen mode like Gadget's gadget full-screen mode would be perfect.

  • Yes, standalone is mucho convenient for using the iPad as an instrument. I'm thinking more and more that this is what I wish to do. I find AUs can be small and I often forget that they can be resized! hehe

    What will we do about AniMoog?

  • @kinkujin said:
    Yes, standalone is mucho convenient for using the iPad as an instrument. I'm thinking more and more that this is what I wish to do. I find AUs can be small and I often forget that they can be resized! hehe

    What will we do about AniMoog?

    As is....lol

  • @SevenSystems said:
    If you use your iPad as an instrument, you don't need either IAA or AU... unless I'm totally misunderstanding you 😉

    (edit: you probably mean AU apps that do not have the ability to be used standalone at all (i.e. don't come with a container app that essentially just hosts the AU full-screen and uses the default audio input / output). In that case, I'd agree that every AU app should offer that 🙂)

    Got it!

    👍

  • Did anyone use the word 'deprecate' yet?

  • I'm coming to realize, only after several years, that the switching between screens and stuff is really, really tedious, and I'm starting to think that maybe, just maybe, hardware is worth the 100x cost for good reason: there is nothing in the way of flow. Each of those 2-second (or more) switches + microtweaks adds up. Each one diminishes your full attention, and it really adds up when capturing the feel in the moment is essential.

  • If it’s not AUv3, I won’t even buy it. I like working inside of a DAW on iOS and using the apps inside of that. My workflow is completely dependent on AU, and IAA is just unreliable.

  • @YZJustDatGuy said:
    If it’s not AUv3, I won’t even buy it. I like working inside of a DAW on iOS and using the apps inside of that. My workflow is completely dependent on AU, and IAA is just unreliable.

    I am not suggesting that YOU use IAA, but many IAA apps are super reliable -- and there are plenty of unreliable AUv3s.

    I can understand why some people prefer to stick to AU, but IAA is not necessarily unreliable, and for some use cases there are workflows involving IAA that work better than an AU-only one.

  • AUV3 when midi editing and tempo sync is important. Otherwise how could you incorporate a nice arpeggiated synth sound into your DAW? You have to be able to edit midi so that arpeggiated notes would be on time etc. IAA when using something like BIAS FX or other effects.
    I haven’t run into an AUV3 resizing issue because I don’t have many AUV3 synths yet.
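For context on the tempo-sync point above: an AUv3 plug-in can ask its host for the current tempo and beat position through the musicalContextBlock that hosts supply, which is one reason tightly synced arpeggios are straightforward in AUv3. A minimal Swift sketch, assuming it lives in an AUAudioUnit subclass's render path (the helper name and fallback value are illustrative):

```swift
import AudioToolbox

// Sketch: query the host's musical context from inside an AUAudioUnit subclass.
// The host sets `musicalContextBlock` on the audio unit; the plug-in calls it
// (typically once per render cycle) to fetch the current tempo and beat position.
extension AUAudioUnit {
    func currentHostTempo(fallback: Double = 120.0) -> (tempo: Double, beatPosition: Double) {
        var tempo = fallback      // beats per minute
        var beatPosition = 0.0    // current song position, in beats

        // Block parameters: tempo, time-signature numerator, time-signature
        // denominator, beat position, sample offset to next beat, and the
        // downbeat position. Pass nil for the values we don't need.
        if let contextBlock = self.musicalContextBlock,
           contextBlock(&tempo, nil, nil, &beatPosition, nil, nil) {
            return (tempo, beatPosition)
        }
        return (fallback, 0.0)    // host provided no musical context
    }
}
```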

  • edited September 2019

    @espiegel123 said:

    @YZJustDatGuy said:
    If it’s not AUv3, I won’t even buy it. I like working inside of a DAW on iOS and using the apps inside of that. My workflow is completely dependent on AU, and IAA is just unreliable.

    I am not suggesting that YOU use IAA, but many IAA apps are super reliable -- and there are plenty of unreliable AUv3s.

    I can understand why some people prefer to stick to AU, but IAA is not necessarily unreliable, and for some use cases there are workflows involving IAA that work better than an AU-only one.

    +1. Some combos like Audiobus/Loopy or Grouptheloop/AUM work really well, for example.
    On the other hand, I’ve encountered more AU crashes than before over the last few months; I don’t know exactly why.
    I wouldn’t trust a setup full of AUs for a live gig, but the combos mentioned above are no problem.

  • @Jimmy said:
    AUV3 when midi editing and tempo sync is important. Otherwise how could you incorporate a nice arpeggiated synth sound into your DAW? You have to be able to edit midi so that arpeggiated notes would be on time etc. IAA when using something like BIAS FX or other effects.
    I haven’t run into an AUV3 resizing issue because I don’t have many AUV3 synths yet.

    There are plenty of IAA apps that work where sync is important. Examples include all the LUMBeat apps and Patterning. Sync with Loopy is tight. People WAY over-generalize about IAA.

  • wim
    edited September 2019

    @Korakios said:
    I love AU performance/stability but really hate floating windows and resizing on small screens...

    Me too, but now I use Audiobus almost exclusively, and it has freed me of window juggling while still allowing me to switch apps with ease. I'm perfectly happy with this compromise.

    (I do feel happy enough with IAA apps that have Audiobus State Saving for now though.)

  • Damn, a lot of y'all are not reading the OP thoroughly.
    @RUST( i )K uses his iPad as an instrument, not using apps through a DAW. I totally get that, because I did that (errrrr, ok, still do) with Model 15, Xynthesizr and Patterning. I connect my iPad directly to Ableton sometimes via Studiomux.
    So there's no screen switching or editing MIDI in a DAW on iOS.
    And what is up with IAA being unreliable? I understand if one says it's difficult to code (I dunno) or redundant with Audiobus. But unreliable? Lol nah.

  • wim
    edited September 2019

    @ph8aerror said:
    Damn, a lot of y'all are not reading the OP thoroughly.

    The rest of the OP notwithstanding, he asked specifically "Do you desire to have IAA/FULL SIZE option for your apps?" not "Do you think I'm wrong to want..."

    I think I read the OP plain enough.

  • edited September 2019

    And to back up wim, I read the OP fine. I just replied how I wanted (aka tangentially) based on what was on my mind at the moment. Oops? I know the OP, of all people, would not have a problem with this :p .

  • If I mostly made music at home, integrating my iPads into a bunch of hardware and desktop stuff, then I would gravitate towards the full-screen instrument scheme. But for 'all on the pad' commuting/couch/crapper use, it's AU all the way.

  • edited September 2019

    Whether an AUv3 is full size or not has nothing to do with IAA vs AUv3.

    An AUv3 can run full size if the host supports it, the plug-in is resizable, and you're running iOS 11 or later (AUv3 size negotiation was introduced in iOS 11).

    It's really up to the devs of hosts and plug-ins to make it happen (a rough sketch of that negotiation is below).
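For reference, the iOS 11 size negotiation mentioned above goes through AUAudioUnitViewConfiguration. A minimal Swift sketch of both sides, where MySynthAudioUnit and requestFullScreen are hypothetical names and the concrete sizes are only examples:

```swift
import AudioToolbox
import CoreAudioKit
import CoreGraphics

// Plug-in side: report which of the host's offered view sizes this AU can render.
class MySynthAudioUnit: AUAudioUnit {    // hypothetical plug-in class
    override func supportedViewConfigurations(
        _ availableViewConfigurations: [AUAudioUnitViewConfiguration]
    ) -> IndexSet {
        // Accept everything the host offers; a fixed-size UI would return
        // only the indices of the configurations it can actually draw.
        IndexSet(integersIn: 0..<availableViewConfigurations.count)
    }
}

// Host side: offer a compact and a full-screen configuration, then ask the
// plug-in to switch to full screen if it says it supports it.
func requestFullScreen(for audioUnit: AUAudioUnit, screenSize: CGSize) {
    let compact = AUAudioUnitViewConfiguration(width: 400, height: 300,
                                               hostHasController: true)
    let fullScreen = AUAudioUnitViewConfiguration(width: screenSize.width,
                                                  height: screenSize.height,
                                                  hostHasController: false)
    let supported = audioUnit.supportedViewConfigurations([compact, fullScreen])
    if supported.contains(1) {           // index 1 is the full-screen option
        audioUnit.select(fullScreen)
    }
}
```

So whether a given plug-in actually appears full size depends on both the host offering a full-screen configuration and the plug-in accepting it, which matches the "up to the devs" point above.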

  • edited September 2019

    @oat_phipps said:
    I'm coming to realize, only after several years, that the switching between screens and stuff is really, really tedious, and I'm starting to think that maybe, just maybe, hardware is worth the 100x cost for good reason: there is nothing in the way of flow. Each of those 2-second (or more) switches + microtweaks adds up. Each one diminishes your full attention, and it really adds up when capturing the feel in the moment is essential.

    I like the affordability of iOS. I agree that working with multiple music Apps can be frustratingly tedious at times.

    I tend to think the root of the issue all comes down to the logistics of working on one small touchscreen.

    If I try to imagine a solution, I think that solution will have to involve the use of more than one integrated touch screen.

    I'd like to have a piece of hardware that I could plug both of my iPads, and any controllers I want to use, into.

    Then have some way of designating one iPad as "Master", and the other as "Slave". There would have to be an "inter-iPad" communication protocol that would support the interconnectivity.

    The idea would be to make it so you could run a host App like a DAW on one iPad. And then while in the DAW, if you tap on an instrument track to bring up the Audio Unit window, that AU window comes up full screen on the "Slave" iPad.

    Ideally, I think such an interface would be designed so the instrument Apps would actually run on the "Slave" iPad. But I think the design should also involve functionality where the "Slave" iPad could just serve as a "second touch screen monitor" for the Master iPad.

    There's a whole new world that could be developed around a system like that. But it would likely require Apple's cooperation to make it happen.

  • One failing of music apps is switching between instruments and editing screens. If you have hardware, everything is to hand, but with apps you have to close a window, locate the next plugin, open it, click to expand it, scroll to the point in the piano roll, and resize it because it’s zoomed out again. It soon gets tedious.

  • @horsetrainer said:

    @oat_phipps said:
    I'm coming to realize, only after several years, that the switching between screens and stuff is really, really tedious, and I'm starting to think that maybe, just maybe, hardware is worth the 100x cost for good reason: there is nothing in the way of flow. Each of those 2-second (or more) switches + microtweaks adds up. Each one diminishes your full attention, and it really adds up when capturing the feel in the moment is essential.

    I like the affordability of iOS. I agree that working with multiple music Apps can be frustratingly tedious at times.

    I tend to think the root of the issue all comes down to the logistics of working on one small touchscreen.

    If I try to imagine a solution, I think that solution will have to involve the use of more than one integrated touch screen.

    I'd like to have a piece of hardware that I could plug both of my iPads, and any controllers I want to use, into.

    Then have some way of designating one iPad as "Master", and the other as "Slave". There would have to be an "inter-iPad" communication protocol that would support the interconnectivity.

    The idea would be to make it so you could run a host App like a DAW on one iPad. And then while in the DAW, if you tap on an instrument track to bring up the Audio Unit window, that AU window comes up full screen on the "Slave" iPad.

    Ideally, I think such an interface would be designed so the instrument Apps would actually run on the "Slave" iPad. But I think the design should also involve functionality where the "Slave" iPad could just serve as a "second touch screen monitor" for the Master iPad.

    There's a whole new world that could be developed around a system like that. But it would likely require Apple's cooperation to make it happen.

    I was thinking a similar thing a little while ago. I bet an app could do it via Bluetooth, particularly for things like having a mixer open on another iPad running the same app.

  • wim
    edited September 2019

    @Mark B said:
    One failing of music apps is switching between instruments and editing screens. If you have hardware, everything is to hand, but with apps you have to close a window, locate the next plugin, open it, click to expand it, scroll to the point in the piano roll, and resize it because it’s zoomed out again. It soon gets tedious.

    Not if you use Audiobus. Well, not as tedious anyway.

  • edited September 2019

    @AudioGus said:

    @horsetrainer said:

    @oat_phipps said:
    I'm coming to realize, only after several years, that the switching between screens and stuff is really, really tedious, and I'm starting to think that maybe, just maybe, hardware is worth the 100x cost for good reason: there is nothing in the way of flow. Each of those 2-second (or more) switches + microtweaks adds up. Each one diminishes your full attention, and it really adds up when capturing the feel in the moment is essential.

    I like the affordability of iOS. I agree that working with multiple music Apps can be frustratingly tedious at times.

    I tend to think the root of the issue all comes down to the logistics of working on one small touchscreen.

    If I try to imagine a solution, I think that solution will have to involve the use of more than one integrated touch screen.

    I'd like to have a piece of hardware that I could plug both of my iPads, and any controllers I want to use, into.

    Then have some way of designating one iPad as "Master", and the other as "Slave". There would have to be an "inter-iPad" communication protocol that would support the interconnectivity.

    The idea would be to make it so you could run a host App like a DAW on one iPad. And then while in the DAW, if you tap on an instrument track to bring up the Audio Unit window, that AU window comes up full screen on the "Slave" iPad.

    Ideally, I think such an interface would be designed so the instrument Apps would actually run on the "Slave" iPad. But I think the design should also involve functionality where the "Slave" iPad could just serve as a "second touch screen monitor" for the Master iPad.

    There's a whole new world that could be developed around a system like that. But it would likely require Apple's cooperation to make it happen.

    I was thinking a similar thing a little while ago. I bet an app could do it via Bluetooth, particularly for things like having a mixer open on another iPad running the same app.

    I'm wondering if such a "Second iPad Monitor System" could be implemented using Apple's upcoming "SideCar" feature?
    From what I've read, I think SideCar uses the AirPlay protocol to communicate.

    If SideCar can work between a Mac and an iPad... then why not also between an iPad and an iPad?

    Then it's a matter of figuring out how to get the UI of a music App to appear on the SideCar-"ed" iPad to be used as a controller.

    Almost seems like it might be possible with what Apple is providing with its SideCar protocol.

    Perhaps, if audio units could be granted access to the SideCar protocol, they could send their UI over to the controller iPad when the UI is requested from the Host App?

    There could be some kind of preference available for an audio unit, so the user could choose where the audio unit UI appears...In the Host App, or on the SideCar-"ed" iPad.

    Almost sounds so possible that it might actually happen... :)

  • edited September 2019

    @horsetrainer said:
    @AudioGus said:

    @horsetrainer said:

    @oat_phipps said:
    I'm coming to realize, only after several years, that the switching between screens and stuff is really, really tedious, and I'm starting to think that maybe, just maybe, hardware is worth the 100x cost for good reason: there is nothing in the way of flow. Each of those 2-second (or more) switches + microtweaks adds up. Each one diminishes your full attention, and it really adds up when capturing the feel in the moment is essential.

    I like the affordability of iOS. I agree that working with multiple music Apps can be frustratingly tedious at times.

    I tend to think the root of the issue all comes down to the logistics of working on one small touchscreen.

    If I try to imagine a solution, I think that solution will have to involve the use of more than one integrated touch screen.

    I'd like to have a piece of hardware that I could plug both of my iPads, and any controllers I want to use, into.

    Then have some way of designating one iPad as "Master", and the other as "Slave". There would have to be an "inter-iPad" communication protocol that would support the interconnectivity.

    The idea would be to make it so you could run a host App like a DAW on one iPad. And then while in the DAW, if you tap on an instrument track to bring up the Audio Unit window, that AU window comes up full screen on the "Slave" iPad.

    Ideally, I think such an interface would be designed so the instrument Apps would actually run on the "Slave" iPad. But I think the design should also involve functionality where the "Slave" iPad could just serve as a "second touch screen monitor" for the Master iPad.

    There's a whole new world that could be developed around a system like that. But it would likely require Apple's cooperation to make it happen.

    I was thinking a similar thing a little while ago. I bet an app could do it via Bluetooth, particularly for things like having a mixer open on another iPad running the same app.

    I'm wondering if such a "Second iPad Monitor System" could be implemented using Apple's upcoming "SideCar" feature?
    From what I've read, I think SideCar uses the AirPlay protocol to communicate.

    If SideCar can work between a Mac and an iPad... then why not also between an iPad and an iPad?

    Then it's a matter of figuring out how to get the UI of a music App to appear on the SideCar-"ed" iPad to be used as a controller.

    Almost seems like it might be possible with what Apple is providing with its SideCar protocol.

    Perhaps, if audio units could be granted access to the SideCar protocol, they could send their UI over to the controller iPad when the UI is requested from the Host App?

    There could be some kind of preference available for an audio unit, so the user could choose where the audio unit UI appears...In the Host App, or on the SideCar-"ed" iPad.

    Almost sounds so possible that it might actually happen... :)

    To me it wouldn't really need to be a second monitor. If you could run the same app on two iPads, there seems to be no reason the app could not talk to itself on the other iPad.

    I posted this a while ago but it could apply to whatever.

    ———————

    Wouldn't it be cool if BM3 could be spread across multiple devices at once?

    One device would act as the 'Host', doing all the audio processing, MIDI, etc., and the other device(s) (which could be far less powerful/more affordable) would act simply as 'Shells', which send commands to the host. Turn a knob on the shell and you see it rotate on the other device. You could jump to other screens and edit note placement while tweaking attack on another screen, etc. (A rough sketch of the idea follows below.)
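As a rough illustration of the "shell sends commands to the host" idea, here is a minimal Swift sketch that uses MultipeerConnectivity for the device-to-device link; ControlMessage and ShellLink are made-up names, and peer discovery/invitation handling is left out:

```swift
import Foundation
import MultipeerConnectivity

// Hypothetical command a "shell" device would send to the "host" device.
struct ControlMessage: Codable {
    let parameterID: String   // e.g. "filterCutoff" (illustrative name)
    let value: Float
}

// Minimal peer-to-peer link; browsing/advertising and invitations are omitted.
final class ShellLink: NSObject, MCSessionDelegate {
    private let peerID = MCPeerID(displayName: "shell-device")
    private lazy var session: MCSession = {
        let s = MCSession(peer: peerID)
        s.delegate = self
        return s
    }()

    // Shell side: send a parameter tweak to every connected peer (the host).
    func send(_ message: ControlMessage) throws {
        let data = try JSONEncoder().encode(message)
        try session.send(data, toPeers: session.connectedPeers, with: .reliable)
    }

    // Host side: decode incoming commands and hand them to the audio engine.
    func session(_ session: MCSession, didReceive data: Data, fromPeer peerID: MCPeerID) {
        if let message = try? JSONDecoder().decode(ControlMessage.self, from: data) {
            print("apply \(message.parameterID) = \(message.value)")
        }
    }

    // Remaining MCSessionDelegate requirements, unused in this sketch.
    func session(_ session: MCSession, peer peerID: MCPeerID, didChange state: MCSessionState) {}
    func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didStartReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, with progress: Progress) {}
    func session(_ session: MCSession, didFinishReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, at localURL: URL?, withError error: Error?) {}
}
```

Tight audio/transport sync across devices would be a much harder problem than sending parameter messages, but the basic host/shell split is the same.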

  • @AudioGus Can’t this kind of be semi accomplished with AUM? Idk exactly how...but I feel with enough Mozaic and Moxie we could figure it out.
