Audiobus: Use your music apps together.

What is Audiobus?

Audiobus is an award-winning music app for iPhone and iPad which lets you use your other music apps together. Chain effects on your favourite synth, run the output of apps or Audio Units into an app like GarageBand or Loopy, or select a different audio interface output for each app. Route MIDI between apps — drive a synth from a MIDI sequencer, or add an arpeggiator to your MIDI keyboard — or sync with your external MIDI gear. And control your entire setup from a MIDI controller.


Audiobus is the app that makes the rest of your setup better.

Use of AU vs IAA (standalone)?


Comments

  • edited September 2019

    @AudioGus said:

    @horsetrainer said:
    @AudioGus said:

    @horsetrainer said:

    @oat_phipps said:
    I'm coming to realize, only after several years, that all the screen switching is really, really tedious, and I'm starting to think that maybe, just maybe, hardware is worth the 100x cost for good reason — there is nothing in the way of flow. Each of those 2-second (or more) switches plus microtweaks adds up. Each one diminishes your full attention, and it really adds up when capturing the feel in the moment is essential.

    I like the affordability of iOS. I agree that working with multiple music Apps can be frustratingly tedious at times.

    I tend to think the root of the issue all comes down to the logistics of working on one small touchscreen.

    If I try to imagine a solution, I think it will have to involve more than one integrated touch screen.

    I'd like to have a piece of hardware that I could plug both of my iPads, and any controllers I want to use, into.

    Then have some way of designating one iPad as "Master", and the other as "Slave". There would have to be an "inter-iPad" communication protocol that would support the interconnectivity.

    The idea would be to make it so you could run a host App like a DAW on one iPad. And then while in the DAW, if you tap on an instrument track to bring up the Audio Unit window, that AU window comes up full screen on the "Slave" iPad.

    Ideally, I think such an interface would be designed so the instrument Apps would actually run on the "Slave" iPad. But I think the design should also allow the "Slave" iPad to serve simply as a second touch-screen monitor for the Master iPad.

    There's a whole new world that could be developed around a system like that. But it would likely require Apple's cooperation to make it happen.

    I was thinking a similar thing a little while ago. I bet an app could do it via bluetooth. Particularly things like being able to have a mixer open on another ipad running the same app.

    I'm wondering if such a "Second iPad Monitor System" could be implemented using Apple's upcoming "Sidecar" feature?
    From what I've read, I think Sidecar uses the AirPlay protocol to communicate.

    If Sidecar can work between a Mac and an iPad... then why not also between an iPad and an iPad?

    Then it's a matter of figuring out how to get the UI of a music App to appear on the Sidecar-ed iPad to be used as a controller.

    Almost seems like it might be possible with what Apple is providing with its Sidecar protocol.

    Perhaps, if audio units could be granted access to the Sidecar protocol, they could send their UI over to the controller iPad when the UI is requested from the Host App?

    There could be some kind of preference available for an audio unit, so the user could choose where the audio unit UI appears... in the Host App, or on the Sidecar-ed iPad.

    Almost sounds so possible that it might actually happen... :)

    To me it wouldn't really need to be a second monitor. If you could run the same app on two iPads, there seems to be no reason the app could not talk to itself on the other iPad.

    I posted this a while ago but it could apply to whatever.

    ———————

    Wouldn't it be cool if BM3 could be spread across multiple devices at once?

    One device would act as the 'Host', doing all the audio processing, midi etc, and the other device(s) (which could be far less powerful/more affordable) acting simply as a 'Shell', which sends commands to the host. Turn a knob on the shell and you see it rotate on the other device. You could jump to other screens and edit note placement while tweaking attack on another screen, etc.

    That certainly is the same general idea!

    Edited to add....

    But with the coming of Sidecar, it seems like Sidecar might be the "easier" way to make such a system work?

    I think it would take a programmer with knowledge of Sidecar to say just how likely it is that Sidecar could be made to work for this purpose.
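The Host/Shell split quoted above boils down to a thin command protocol: the shell serializes small control gestures, and the host folds them into its parameter state (and could echo state back so every shell's UI stays in sync). Here is a minimal, purely illustrative sketch in Python — the message fields and parameter names are invented, and a real link would add a transport such as UDP or Bluetooth:

```python
import json

# Hypothetical message format for a Host/Shell link: the shell serializes
# small control events; the host applies them to its own state and could
# broadcast the updated state back so every shell's UI stays in sync.

def shell_command(screen: str, control: str, value: float) -> bytes:
    """Encode one control gesture on the shell as a compact JSON message."""
    return json.dumps({"screen": screen, "control": control, "value": value}).encode()

def host_apply(state: dict, message: bytes) -> dict:
    """Decode a shell message and fold it into the host's parameter state."""
    cmd = json.loads(message)
    state[(cmd["screen"], cmd["control"])] = cmd["value"]
    return state

# Turning a knob on the shell...
msg = shell_command("sampler", "attack", 0.3)
# ...updates the host, which would then render the knob at 0.3 everywhere.
state = host_apply({}, msg)
```

Keeping all audio processing on the host and sending only tiny state-change messages is what would let the shell device be far less powerful, as the quoted post suggests.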

  • @mjcouche said:
    @AudioGus Can’t this kind of be semi accomplished with AUM? Idk exactly how...but I feel with enough Mozaic and Moxie we could figure it out.

    MIDI control is always nice of course, but this is one-to-one UI parity with no setup.

  • edited September 2019

    @horsetrainer said:
    That certainly is the same general idea!

    But with the coming of Sidecar, it seems like Sidecar might be the "easier" way to make such a system work?

    As far as I can tell Sidecar is essentially just like Duet, in that your iPad becomes a second desktop monitor.

    What I am talking about doesn't have anything to do with streaming video or audio.

  • @Mark B said:
    One failing in the music apps is switching between instruments and editing screens. If you have hardware then everything is to hand but with the apps you have to close a window, locate the next plugin, open it, click to expand it, scroll to the point in the piano roll and resize it because it’s zoomed out again. It soon gets tedious.

    For me PERSONALLY, the iPad workflow still beats getting up from the chair and walking to my next synth at the other desk while tripping over MIDI cables and power lines etc., then getting back to my sequencer/computer, and back and forth.
    I know there are limitations, but my creative workflow just works with an iPad.
    Call me crazy, but I even hate MIDI keyboards now. They are like pen and paper. I have to apply pressure, which I hate. LOL. Rather type digitally.

  • edited September 2019

    @AudioGus said:
    As far as I can tell Sidecar is essentially just like Duet, in that your iPad becomes a second desktop monitor.

    What I am talking about doesn't have anything to do with streaming video or audio.

    If I've correctly understood what I've read about Sidecar, it allows macOS Catalina to use an iPad's touchscreen, via AirPlay communication, as a control device for applications running on the Mac.

    (For example, using an iPad with Apple Pencil as a drawing tablet to input drawing data to an image-editing program running on the Mac.)

    For Sidecar to accomplish that type of functionality via AirPlay, I would guess that Apple is using the AirPlay communication protocol to exchange more than just audio and video data with the iPad... ?

    What I'm thinking is... why shouldn't the Sidecar protocol also be capable of allowing two iPads to link with one another similarly?

    For example... iPad #1 is running a DAW like Cubasis. The user then sets up iPad #2 to link with iPad #1 via Sidecar.

    Imagine Cubasis has been updated to work with Sidecar, and an Audio Unit synth has also been updated to work with Sidecar.

    In theory... the Cubasis user could set a preference in Cubasis, telling Cubasis to open any Sidecar-compatible Audio Units on the screen of iPad #2, using the Sidecar protocol.

    As long as the Sidecar protocol could be made capable of providing complete control of all the necessary parameters in the Audio Unit's user interface, I think it might provide a new kind of feature for iPadOS devices.

    So what I'm suggesting here is not anything I think iPadOS is capable of doing now, but rather a kind of functionality that hypothetically might come to Sidecar in the future... should Apple want this imagined feature to exist.

    One of the existing features of AirPlay is audio transfer to another Apple device.

    That makes me consider a hypothetical future possibility... that iPad music apps with Sidecar compatibility might not be limited to just user-interface control. Perhaps they might exchange audio and MIDI data wirelessly between two iPads as well?

    All I'm doing here is speculating about... "what might happen in the future".

  • @horsetrainer said:

    All I'm doing here is speculating about... "what might happen in the future".

    Absolutely, very cool stuff awaits and this is just the beginning.

  • @Jimmy said:

    For me PERSONALLY, the iPad workflow still beats getting up from the chair and walking to my next synth at the other desk while tripping over MIDI cables and power lines etc., then getting back to my sequencer/computer, and back and forth.
    I know there are limitations, but my creative workflow just works with an iPad.
    Call me crazy, but I even hate MIDI keyboards now. They are like pen and paper. I have to apply pressure, which I hate. LOL. Rather type digitally.

    It's all about muscle memory for me. The more I use BM3, the more I forget about the individual taps, swipes and screen switching. More and more I can just focus on the music in a mellow, seamless creative bliss.

  • I rarely use apps out of AU mode these days, instruments included. Some apps take advantage of full screen mode in host, more will follow.

  • @AudioGus said:

    I was thinking a similar thing a little while ago. I bet an app could do it via bluetooth. Particularly things like being able to have a mixer open on another ipad running the same app.

    Every app could "easily" support it via OSC messages: the master app runs on one iPad, and the second device runs just the GUI, sending and receiving OSC. If the dev could easily port the GUI to other platforms, then even a cheap Android tablet could serve as the control interface.
    The Reaper DAW does this for VSTs, so the app on iOS might not even need to send the messages itself; the host could send them (more technical, no clue if it's possible).
    But OSC control is certainly possible. Think of it like a controller utility app, except that instead of making your own controls, the GUI is the same as the master app and the controls are pre-assigned...
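The OSC suggestion above is concrete enough to sketch. Below is a minimal, hypothetical Python example of encoding a control gesture as a raw OSC 1.0 message (address pattern, type tag string, big-endian arguments, everything padded to 4-byte boundaries); the address `/synth/filter/cutoff` is invented, and a real controller app would send these bytes over UDP:

```python
import struct

def _pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode an OSC message with float/int arguments into its wire format."""
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)  # big-endian float32
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)  # big-endian int32
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return _pad(address.encode()) + _pad(tags.encode()) + payload

# The "shell" device would emit something like this whenever a knob moves;
# the master app decodes it and updates the real parameter.
msg = osc_message("/synth/filter/cutoff", 0.75)
```

Because OSC is just a small binary format over the network, the controls-on-another-tablet idea doesn't depend on any Apple-specific protocol, which is why even an Android device could play the shell role.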

    I really like full-sized standalone apps sometimes; apps like MIDI sequencers with Ableton Link are great.

  • @oat_phipps said:
    I'm coming to realize, only after several years, that all the screen switching is really, really tedious, and I'm starting to think that maybe, just maybe, hardware is worth the 100x cost for good reason — there is nothing in the way of flow. Each of those 2-second (or more) switches plus microtweaks adds up. Each one diminishes your full attention, and it really adds up when capturing the feel in the moment is essential.

    BINGO!

    I used to comment back in the day....I don't get the AU fervor ......still not helping productivity or performance aspect of making music ........

    IF I WANT TO FIDDLE WITH SCREENS AND CLOSING OUT WINDOWS I WILL USE MY MACBOOK! LOL

    The point of iOS for me was it was DIFFERENT and MORE USEFUL.

    Seems like it is now just the smaller, less useful version of full-sized computer programs, or at least that style......

    JUST MY OPINION........

    But you are fucking right!

  • @AudioGus said:

    Wouldn't it be cool if BM3 could be spread across multiple devices at once?

    One device would act as the 'Host', doing all the audio processing, midi etc and the other device(s) (which could be far less powerful/more affordable) acting simply as a 'Shell', which send commands to the host. Turn a knob on the shell and you see it rotate on the other device. You could jump to other screens and edit note placement while tweaking attack on another screen etc.

    EXACTLY!

    That is why I use AUDIOBUS and AB Remote, in a way......

    My whole idea is music making should flow and be connected in a logical way..... not a convoluted, circuitous route or logic.

    The issue is probably Apple making things hard for developers, I assume..... you know how it is with Apple products and using them like actual open-ended computers that you can use as you wish..... lol

  • Do you know what would SOLVE ALL IOS MUSIC MAKING ISSUES?

    A B L E T O N for my iPAD!

  • @RUST( i )K said:

    @audio_DT said:
    I’ve answered ‘no’, but what I really mean is that I like AUs that are full size. I never use IAA unless there’s literally no other option. It’s too unwieldy, it doesn’t state save, and it gets in the way of doing things because too much time is spent messing about trying to get everything working as it should do.

    I use some iOS devices with 1 app as an instrument or audio signal depot or origin so I guess my use is unique to some.

    Thanks for your feedback!

    I’m with you bro!
    Even more full screen apps vs AUv3 floating mesh...
    I suppose Apple expects the garageband/Bm3 approach, where AUs get integrated into the host app GUI instead of floating, but then it forces users into one workflow...
    I can’t believe Apple doing something like that... :trollface:

    Anyways we are in the endgame...

  • edited September 2019

    Sounds like people want their cake and eat it too...

    Originally AUv3 filled half the screen, so they could be integrated seamlessly in the UI of the host. This way you don't have to do screen switching all the time. Really nice!

    But people kept whining and nagging that they want fullscreen AUv3. Now they have it and they're complaining about all the screen switching.

    I'm sure you can figure out the pattern here. Hint: the standard is not the problem :|

  • @brambos said:
    Sounds like people want their cake and eat it too...

    I don't personally care about AUv3.

    I am fine with 1 per app; after all those years I got used to it....LOL

    Bram... is it possible from a development side to have a minimize/maximize like on the computer for the AU windows, or is that not an option due to the system?

  • @TheDubbyLabby said:

    Anyways we are in the endgame...

    End of days.....lol

  • edited September 2019

    There are currently instrument apps that are both IAA & AU. It's nice to be able to open the standalone app itself and just play. It's also nice to set it up as an AU in your "host". It's nice to be able to use that app through its IAA functionality if I feel like it. It's great that AU is the future and all that; the irony is that in focusing on the future we may be frowning on the versatility we currently have.
    I want both.

    I would also love to see a return to "touch MIDI learn"... like the IAA and Audiobus old days. As soon as I personally have to start going into little menus and entering numbers... or doing a multi-step MIDI learn per knob... it is distracting to the creative process.

    Maybe some "MIDI learn" standards could be adopted, like in Ableton: for all plug-ins you just hit the button, twist... "learned".....

    A girl can dream
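The one-touch "hit the button, twist... learned" flow described above can be sketched in a few lines. This is a hypothetical illustration of the logic only, not how any particular app implements it; the parameter name is invented, and real code would receive the 3-byte Control Change messages from CoreMIDI:

```python
# A minimal sketch of one-touch MIDI learn, assuming raw 3-byte MIDI
# messages (status, data1, data2). The parameter name is invented.

class MidiLearn:
    def __init__(self):
        self.mappings = {}      # (channel, cc_number) -> parameter name
        self.learning = None    # parameter currently armed for learning

    def arm(self, parameter: str):
        """'Hit the button': arm a parameter so the next CC binds to it."""
        self.learning = parameter

    def handle(self, status: int, data1: int, data2: int):
        """'Twist the knob': learn or route an incoming Control Change."""
        if status & 0xF0 != 0xB0:     # only Control Change messages (0xBn)
            return None
        key = (status & 0x0F, data1)  # (MIDI channel, CC number)
        if self.learning is not None:
            self.mappings[key] = self.learning  # one twist and it's learned
            self.learning = None
        if key in self.mappings:
            # Scale 0..127 to 0.0..1.0 for the bound parameter
            return (self.mappings[key], data2 / 127)
        return None

ml = MidiLearn()
ml.arm("filter_cutoff")
result = ml.handle(0xB0, 74, 64)  # first CC 74 on channel 1 binds and routes
```

The whole "standard" amounts to a two-state machine (idle vs. armed), which is why the one-gesture workflow is so much less distracting than menus and number entry.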

  • @brambos said:
    But people kept whining and nagging that they want fullscreen AUv3. Now they have it and they're complaining about all the screen switching.

    I'm sure you can figure out the pattern here. Hint: the standard is not the problem :|

    LOL. Classic! :)

  • @Simon said:


    LOL. Classic! :)

    Classic what?

    What do you mean?

  • @RUST( i )K said:
    Classic what?


    What do you mean?

    I mean his comment about users is classic i.e. well observed and quite amusing.

    I'd like to be able to run synths, grooveboxes, drum machines etc. without having to also use a host... I only need a host once I start trying to combine things.

  • @DavidM said:
    AUv3 full size or not has nothing to do with IAA vs AUv3.

    AUv3 can run full size if the host supports it, the plug-in is resizable, and you run iOS 11 or up (AUv3 size negotiation was introduced in iOS 11).

    It's up to devs of hosts and plugins to make it happen really.

    Exactly, there's nothing in the Audio Unit framework that stops developers from creating a host app that supports full screen AU views. It's true that the UI of the AU needs to be scalable, but that's not that hard to realise with Auto Layout unless your UI is completely custom, and with SwiftUI this is going to be even easier.

  • edited September 2019

    @RUST( i )K said:
    Bram...is it possible from a development side to have a minimize / maximixe like on the computer for the AU windows or is that not an option due to the system?

    I'm confused... all hosts already allow this and most recent AUv3 plugins already do this? Or am I misunderstanding what you're asking?

  • edited September 2019

    @brambos said:
    Sounds like people want their cake and eat it too...

    Originally AUv3 filled half the screen, so they could be integrated seamlessly in the UI of the host. This way you don't have to do screen switching all the time. Really nice!

    But people kept whining and nagging that they want fullscreen AUv3. Now they have it and they're complaining about all the screen switching.

    I'm sure you can figure out the pattern here. Hint: the standard is not the problem :|

    I'm with @brambos: if I wanted Ableton, I'd use my computer. The iPad should be its own paradigm, which is why I use AUM/iPad more than Ableton these days.

    If standalone/fullscreen is preferred, why not get a few iPads with cheap audio interfaces and use them as separate pieces of external hardware?

    A used iPad + $150 audio interface is still cheaper than a 0coast, Minilogue XD, etc.

  • edited September 2019

    @auxmux said:

    @brambos said:
    Sounds like people want their cake and eat it too...

    Originally AUv3 filled half the screen, so they could be integrated seamlessly in the UI of the host. This way you don't have to do screen switching all the time. Really nice!

    But people kept whining and nagging that they want fullscreen AUv3. Now they have it and they're complaining about all the screen switching.

    I'm sure you can figure out the pattern here. Hint: the standard is not the problem :|

    I'm with @brambos: if I wanted Ableton, I'd use my computer. The iPad should be its own paradigm, which is why I use AUM/iPad more than Ableton these days.

    If standalone/fullscreen is preferred, why not get a few iPads with cheap audio interfaces and use them as separate pieces of external hardware?

    Yeah.. don't get me wrong, I like the fullscreen mode of AU plugins a lot. I think it's valuable. But having to juggle screens is in that case pretty much unavoidable, so I simply don't understand why people complain about it. If you don't like screen switching you can use most plugins in their letterbox mode; essentially giving you direct access to the plugin and the host simultaneously. I can't think of another solution short of invoking dark magic. :)

  • @auxmux said:

    @brambos said:
    Sounds like people want their cake and eat it too...

    Originally AUv3 filled half the screen, so they could be integrated seamlessly in the UI of the host. This way you don't have to do screen switching all the time. Really nice!

    But people kept whining and nagging that they want fullscreen AUv3. Now they have it and they're complaining about all the screen switching.

    I'm sure you can figure out the pattern here. Hint: the standard is not the problem :|

    I'm with @brambos: if I wanted Ableton, I'd use my computer. The iPad should be its own paradigm, which is why I use AUM/iPad more than Ableton these days.

    If standalone/fullscreen is preferred, why not get a few iPads with cheap audio interfaces and use them as separate pieces of external hardware?

    A used iPad + $150 audio interface is still cheaper than a 0coast, Minilogue XD, etc.

    First, I agree with you.

    But I think, in a roundabout way, that is the overarching context some people are coming from.

    Many "improvements" to iOS merely make it more and more like plug-ins in Ableton, instead of offering a different experience.

    I think that is why we in the industry are at a crossroads of determining who uses iOS for music making, how they use it, and the best way to proceed. Changes will happen inevitably, but hopefully they will be based on user experiences and needs.

    As for getting an iPad per instrument, that is what I and many other people do. But with the changes, as Jakob indicated, full-size options are not a given for each app.

    You are right in theory with your proposal, but if IAA and/or standalone versions are eliminated entirely, that will not be an option. That is one of our concerns. A user is then forced into the "host" app scenario, which is not always ideal, even just in terms of how apps are sized and oriented on the screen.

    Kai described how he programs and always includes a full standalone version of each app, which is vital for me.

  • @brambos said:

    @RUST( i )K said:
    Bram... is it possible from the development side to have a minimize/maximize option like on the computer for AU windows, or is that not an option due to the system?

    I'm confused... all hosts already allow this and most recent AUv3 plugins already do this? Or am I misunderstanding what you're asking?

    I mean having an option like this in apps, in other words one-touch sizing of an AU into full screen. Some apps have sides or bottoms that aren't visible, and the user is forced to scroll or adjust the scale of the AU graphic.

    Your apps work great, so this isn't an issue with them... so you're not the developer I was hoping to reach in this thread... lol

  • @RUST( i )K said:

    @auxmux said:

    @brambos said:
    Sounds like people want their cake and eat it too...

    Originally AUv3 filled half the screen, so they could be integrated seamlessly in the UI of the host. This way you don't have to do screen switching all the time. Really nice!

    But people kept whining and nagging that they want fullscreen AUv3. Now they have it and they're complaining about all the screen switching.

    I'm sure you can figure out the pattern here. Hint: the standard is not the problem :|

    I'm with @brambos: if I wanted Ableton, I'd use my computer. The iPad should be its own paradigm, which is why I use AUM/iPad more than Ableton these days.

    If standalone/fullscreen is preferred, why not get a few iPads with cheap audio interfaces and use them as separate pieces of external hardware?

    A used iPad + $150 audio interface is still cheaper than a 0coast, Minilogue XD, etc.

    First, I agree with you.

    But I think, in a roundabout way, that is the overarching context some people are coming from.

    Many "improvements" to iOS merely make it more and more like plug-ins in Ableton, instead of offering a different experience.

    I think that is why we in the industry are at a crossroads of determining who uses iOS for music making, how they use it, and the best way to proceed. Changes will happen inevitably, but hopefully they will be based on user experiences and needs.

    As for getting an iPad per instrument, that is what I and many other people do. But with the changes, as Jakob indicated, full-size options are not a given for each app.

    You are right in theory with your proposal, but if IAA and/or standalone versions are eliminated entirely, that will not be an option. That is one of our concerns. A user is then forced into the "host" app scenario, which is not always ideal, even just in terms of how apps are sized and oriented on the screen.

    Kai described how he programs and always includes a full standalone version of each app, which is vital for me.

    Yeah, I agree that any instruments or sound generators should come in a standalone format so that option is available when preferred. FX, not so much. There are times when you want to be able to focus on one thing.
