Question for developers: Any significant audio / MIDI changes of late / iOS 15?

edited April 2022 in App Development

Hello fellow devs - hoping you can help.

Can anyone confirm that, as far as they're aware, there have been no really significant changes to audio and MIDI "stuff" in the last few iOS updates?

Almost needless to say, I'm particularly interested in anything AudioUnit related.

I'm also interested in any improvements relating to Swift / audio / MIDI APIs integration.

And finally, anyone else get the strong feeling AVFoundation will be deprecated before too long?

Seemed like there was a big fanfare (at least by Apple's music app standards) about AVFoundation back around iOS 9, but now everything is so "AU or go home" that nothing seems to have happened, as far as I'm aware, to bridge the gap between AUv3 development and the AVFoundation take on AU graphs and what-not. Real shame, as it's kinda close.

Maybe we'll see something new at the WW (emoji / social media app) Developer Conference... :D

And, in case anyone's wondering where I've been, I've been working on other / desktop projects for a while. And if anyone's wondering why that is, well...

I did have a whole load of stuff in development for new instruments for moodunits which I wanted to be a nice surprise for y'all, but...

Apple have decided to stop hosting any new IAP downloadable content, so I've had to abandon that, as doing my own download hosting is not commercially viable for an operation of my size. So yeah... thanks for that, Apple, especially as getting IAP downloads working with moodunits was surprisingly painful :'(

Just to be clear, Apple will continue to support existing downloads (who knows for how long...) but aren't accepting new content. That's my understanding anyway.

Anyway, enough about me; hope someone can spare the time to chip in and share their thoughts.

Finally, and more importantly, hope everyone is AOK!

Cheers, -Rob

Comments

  • edited April 2022

    Wow, this seems like a massive change for those developers who rely on big IAP bundles. I can’t imagine it will be feasible to host this content if you are not already a big company.

    Will it still be possible to bundle content with apps and unlock in-situ (at the expense of large initial installs)?

    Edit to add: @moodscaper good to hear from you as well and hope you are doing well. Are the desktop things you have been working on music related?

  • edited April 2022

    Hi @MisplacedDevelopment, should you feel inclined, there's quite a big thread on it here:

    https://developer.apple.com/forums/thread/701268

    And yeah it was a bit "out of the blue" for sure! :#

    And yes, desktop stuff has been a mix of Kontakt instruments (some freelance dev work and one of my own) and (hopefully, before too long) a synth plugin which I'm very excited about.

    For the desktop stuff, I'm trading under the name of Waverley Instruments:

    https://waverley-instruments.com

    On the off-chance, for anyone interested in Synthetic Materials, it might be worth holding off for a little while, as we're just finishing off a video for the user sample import stuff, then there'll be a discount offer / sale. ;)

  • @moodscaper: ugh! It sounds like Apple has decided the pace of the IAA-to-AUv3 transition has been too slow and is trying to speed things along.

  • edited April 2022

    @espiegel123 said:
    @moodscaper: ugh! It sounds like Apple has decided the pace of the IAA-to-AUv3 transition has been too slow and is trying to speed things along.

    Well, funny thing is, there's nothing documented about getting IAA working with AVFoundation. It's possible, though, with a bit of hacking. So AVFoundation is not about IAA.

    And...

    AVFoundation effects and instruments are actually Audio Units. Plus... in order to get IAA working with AVFoundation, the "thing" you publish to the outside world is the Audio Unit representing your main output.
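
    Roughly, the hack looks something like this. Just a minimal sketch (untested as written): the four-char codes and the node name are placeholders, and the same component description also has to appear under AudioComponents in the app's Info.plist:

    ```swift
    import AVFoundation
    import AudioToolbox

    // Hypothetical helper for building four-char codes.
    func fourCC(_ s: String) -> FourCharCode {
        s.utf8.reduce(0) { ($0 << 8) | FourCharCode($1) }
    }

    let engine = AVAudioEngine()
    // ... attach and connect your nodes here, then:
    try engine.start()

    // Placeholder description; IAA hosts match this against Info.plist.
    var desc = AudioComponentDescription(
        componentType: kAudioUnitType_RemoteGenerator,
        componentSubType: fourCC("dem0"),       // hypothetical
        componentManufacturer: fourCC("Demo"),  // hypothetical
        componentFlags: 0,
        componentFlagsMask: 0)

    // The engine's output node is backed by a plain AudioUnit: that's the
    // "thing" you publish to the outside world.
    if let outputUnit = engine.outputNode.audioUnit {
        let status = AudioOutputUnitPublish(&desc, "Engine Output" as CFString,
                                            1, outputUnit)
        assert(status == noErr)
    }
    ```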

    My frustration is, all that stuff is working great, but it's never (AFAIK) really been developed any further than what they did for iOS 9, plus a sprinkle of tweaks here and there since then. I don't think it would be a huge job to add some APIs that let you build instruments and effects with what's provided in AVFoundation and present that as an Audio Unit. There's loads of cool stuff you could do and never have to worry about realtime audio threads, etc.

    Honestly, AVFoundation is so close to being a decent high-level framework for creating Audio Units, but I suspect Apple will abandon it. It drives me mad! :smiley:

  • @moodscaper said:
    [...]

    My comment was in relation to accepting new IAPs for IAA.

  • edited April 2022

    The one-month notice and removal of the StoreKit stuff is completely bizarre. I can't quite figure out how Apple is going to pull this maneuver and still try to justify the 15/30% cut for "hosting" your app and content. Apple's documentation still has SKDownload as the preferred method and doesn't note the changes anywhere. This has to be the worst move and communication from Apple I've seen yet.
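
    For anyone who hasn't used it, the flow they're pulling looks roughly like this. A sketch from memory, with the error handling and the other observer callbacks trimmed:

    ```swift
    import StoreKit

    // Observer for the Apple-hosted IAP content flow (SKDownload),
    // i.e. the mechanism being discontinued for new content.
    final class HostedContentObserver: NSObject, SKPaymentTransactionObserver {
        func paymentQueue(_ queue: SKPaymentQueue,
                          updatedTransactions transactions: [SKPaymentTransaction]) {
            for transaction in transactions where transaction.transactionState == .purchased {
                if transaction.downloads.isEmpty {
                    queue.finishTransaction(transaction)
                } else {
                    // Kick off the download of the Apple-hosted content package.
                    queue.start(transaction.downloads)
                }
            }
        }

        func paymentQueue(_ queue: SKPaymentQueue,
                          updatedDownloads downloads: [SKDownload]) {
            for download in downloads where download.state == .finished {
                // contentURL points at the unpacked package on disk; copy what
                // you need somewhere permanent before finishing the transaction.
                print("Hosted content at:", download.contentURL?.path ?? "?")
                queue.finishTransaction(download.transaction)
            }
        }
    }
    ```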

    Speaking of poor communication from Apple, there hasn't been anything audio-related noted in the point releases since the first 15.1 betas, which noted that audio units now had custom UIs. That one left me confused until I realized they were only talking about their own audio units.

    I haven't seen any changes that I can remember. Maybe the audio group will actually have some useful sessions at the upcoming WWDC. I kinda have my doubts.

    AVFoundation is still in active development. They added several new components at last year's WWDC. It's mostly not focused on straight music creation, but that part is pretty solid at this point. AVAudioEngine and AVAudioSession are the way Apple wants you to use and configure Audio Unit graphs, as far as I can tell. My perception of AVFoundation is that it's always been AU-focused, even before the AUv3 spec was exposed on iOS. I don't see how they can deprecate AVFoundation when it's pretty much the only way to get audio into iOS and work with it. Personally, I'd prefer it if they took a bit of focus and control away from AVFoundation and let people get back into the HAL and Core Audio directly.

    Maybe I'm missing what you were looking for in AVFoundation. Was there something you were hoping they were going to do with it that it doesn't do?

    Edit: Your post above came through as I was writing this. My take is that, even on macOS, the core of your processing code is supposed to be in an AU. It may never be exposed outside the graph. You can chain multiple of the built-in AUs together to do lots of processing via the audio engine, too.
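
    For example, a basic session-plus-engine chain looks like this (a sketch; any source node would do in place of the player):

    ```swift
    import AVFoundation

    // Configure the shared audio session first (iOS).
    try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
    try AVAudioSession.sharedInstance().setActive(true)

    // Chain two of Apple's built-in effect units (AUs under the hood)
    // via AVAudioEngine; no realtime-thread code required.
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let delay = AVAudioUnitDelay()
    let reverb = AVAudioUnitReverb()

    engine.attach(player)
    engine.attach(delay)
    engine.attach(reverb)

    delay.delayTime = 0.35
    reverb.loadFactoryPreset(.largeHall)
    reverb.wetDryMix = 30

    // player -> delay -> reverb -> main mixer
    let format = engine.outputNode.outputFormat(forBus: 0)
    engine.connect(player, to: delay, format: format)
    engine.connect(delay, to: reverb, format: format)
    engine.connect(reverb, to: engine.mainMixerNode, format: format)

    try engine.start()
    player.play()
    ```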

  • edited April 2022

    Perhaps a bit off topic, but could you PLEASE make your apps available as AUs for macOS? I understand that for most plugins on iOS it's largely a matter of clicking a checkbox on your side, so people can download them to the desktop and use them in their DAW of choice (mine's GarageBand). I'd love to be able to use your sample-playing synths on multiple platforms.

  • edited April 2022

    @espiegel123 said: My comment was in relation to accepting new IAPs for IAA.

    Bit of confusion here then - sorry. The IAP content I'm talking about is in an app that's entirely AU-based (moodunits).

  • @NeonSilicon said: You can chain multiple of the built-in AUs together to do lots of processing via the audio engine, too.

    Indeed you can! One of my apps (touchscaper) is based entirely around that, including fx chains etc., etc. However...

    I see no way of (for example) exposing any part of the graph I've made for touchscaper as an AU. I know how to support IAA with a graph created with AVFoundation, but not AU.

    I guess that's kinda my point - it should be easy, but isn't. But hey, that's Apple's tech stack for ya.

  • @NeuM said: I understand for most plugins on iOS, it's largely a matter of clicking a checkbox on your side

    Really sorry, I'd love that too, but...

    On the one hand, yes, it is that simple. But will the app(s) actually work OK on the desktop without fairly major work? In my case, no, I'm afraid, and I've tested them all - or in some cases it just doesn't make sense to have desktop versions.

    In the case you mention, the sample-playing synths (Waverleys), which I thought would have had the best chance of working, don't, because the Apple sampler I've used under the hood does not play nice at all on the desktop once you run more than a couple of instances. I have no idea why that is, but I do know there are quite a few issues with it on iOS which I had to work around in my specific case.

    It's kinda nuts that I can run 8 instances fine in GarageBand on an iPad, but can't run more than 2 on a high-end MacBook running Logic.

  • @moodscaper said:
    [...]

    I understand your concerns about quality control. Have you tested for compatibility on M1 Macs? Among developers whose iOS apps do work on M1, I've seen about a 75% success rate (among the plugins I own) with their apps working as Audio Units in GarageBand on macOS. I understand you could prevent your app from being offered on the older Intel machines.

  • @moodscaper said:
    [...]

    I see. Yeah, I don't think there is any way to host an AU in an AU on iOS. It should be doable, even if you had to make your own input and output, run your own thread, and then merge it back into the main AU's buffers and thread. But I couldn't figure out how to get it to work. I wanted to make a wrapper AU to expose all the built-in units, but I gave up in frustration.

    @moodscaper said:
    [...]

    I tried using the checkbox to let my AUs work on M1-based Macs. That had multiple bugs that caused them to crash the DAW.

    Did you try using Catalyst to port your AUs? This has worked for mine, with only some platform-specific code to handle gestures and presets. But I don't have any embedded AUs in mine, so I don't know how well that would work on macOS.

  • edited April 2022

    @NeonSilicon said:
    [...]

    Incidentally, @NeonSilicon, I’m currently using several of your M1-compatible plugins (LRC5, GyroVibe, PhaseDelayArray, Spirangle, xTrem) on an iMac as AUs in GarageBand with no crashing or other issues.

  • @NeonSilicon said: I wanted to make a wrapper AU to expose all the built-in units but I gave up in frustration.

    Funnily enough, that's pretty much what my moodunits (AU) app is. :smile:

    @NeuM - I tested on Intel and M1 - same issue(s), more or less. Sadly this is an Apple AU, so it's beyond my control.

  • @moodscaper said:
    [...]

    So you have the system AUs running inside your moodunits AU, or did you end up reimplementing the effects units yourself?

  • @NeuM said:
    [...]

    Incidentally, @NeonSilicon I’m currently using several of your M1-compatible plugins (LRC5, GyroVibe, PhaseDelayArray, Spirangle, xTrem) on an iMac as AU’s in GarageBand with no crashing or other issues.

    Cool! Nice to know that they are working. Thanks!

  • @NeonSilicon said: So you have the system AUs running inside your moodunits AU, or did you end up reimplementing the effects units yourself?

    Kinda more the former: my AU is just a lightweight wrapper around the Apple AU. So, for example, when the host asks for "my" render block, I give it the render block of the underlying Apple AU.
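
    In skeleton form it's something like this. A sketch of the idea rather than my actual code (the class name is made up, and borrowing the inner unit's busses is a shortcut a real AU probably shouldn't take):

    ```swift
    import AudioToolbox
    import AVFoundation

    // Hypothetical AUv3 that wraps one of Apple's built-in effects and
    // forwards the host's render calls to it.
    final class WrapperAudioUnit: AUAudioUnit {
        private let innerUnit: AUAudioUnit

        override init(componentDescription: AudioComponentDescription,
                      options: AudioComponentInstantiationOptions = []) throws {
            // Wrap Apple's iOS reverb as the example inner unit.
            let innerDesc = AudioComponentDescription(
                componentType: kAudioUnitType_Effect,
                componentSubType: kAudioUnitSubType_Reverb2,
                componentManufacturer: kAudioUnitManufacturer_Apple,
                componentFlags: 0,
                componentFlagsMask: 0)
            innerUnit = try AUAudioUnit(componentDescription: innerDesc)
            try super.init(componentDescription: componentDescription, options: options)
        }

        // Shortcut: expose the inner unit's busses as our own.
        override var inputBusses: AUAudioUnitBusArray { innerUnit.inputBusses }
        override var outputBusses: AUAudioUnitBusArray { innerUnit.outputBusses }

        override func allocateRenderResources() throws {
            try super.allocateRenderResources()
            try innerUnit.allocateRenderResources()
        }

        override func deallocateRenderResources() {
            innerUnit.deallocateRenderResources()
            super.deallocateRenderResources()
        }

        override var internalRenderBlock: AUInternalRenderBlock {
            // Capture the inner render block up front; never touch self on
            // the audio thread.
            let innerRender = innerUnit.renderBlock
            return { actionFlags, timestamp, frameCount, outputBusNumber,
                     outputData, _, pullInputBlock in
                // Hand the host's request straight to the wrapped AU
                // (MIDI events ignored, since this wraps an effect).
                innerRender(actionFlags, timestamp, frameCount, outputBusNumber,
                            outputData, pullInputBlock)
            }
        }
    }
    ```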

  • @moodscaper said:
    [...]

    Interesting. It's good to know that it works, at least to some degree. Ideally it would be relatively easy to make a chain of effects that could be used this way. I wanted to make it so that some of the internal AUs could be used for processing the output of my string resonance plugin in any DAW, but after fighting with it for a bit I figured it would be easier to port some of my version 2 Audio Units from OS X to do the job. So I went that direction instead.

  • @NeonSilicon - yeah, funny, looks like we covered similar ground. As soon as I started trying to chain AUs and do something "graph-like", it all went horribly wrong. I came to a similar conclusion, and figured most folks (especially folks here) would just use AUM :smile:

  • Slightly related:
    AVAudioSequencer (and related structs) hasn't been updated since its release. It was for MIDI playback only, with no fs given for generating a sequence in code. For that, you still have to use the ancient C toolkit API, e.g. MusicTrack.
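
    For the curious, "generating a sequence in code" still looks like this. A sketch with the OSStatus checks omitted:

    ```swift
    import AudioToolbox

    // Build a one-track, one-note sequence with the ancient C API.
    var sequence: MusicSequence?
    NewMusicSequence(&sequence)

    var track: MusicTrack?
    MusicSequenceNewTrack(sequence!, &track)

    // Middle C, velocity 64, one beat long, placed at beat 0.
    var note = MIDINoteMessage(channel: 0, note: 60, velocity: 64,
                               releaseVelocity: 0, duration: 1.0)
    MusicTrackNewMIDINoteEvent(track!, 0.0, &note)

    // To get this into AVAudioSequencer you'd have to round-trip it
    // through a MIDI file (e.g. via MusicSequenceFileCreate).
    ```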

  • @GeneDeLisaDev said: with no fs given...

    I think you've summarised Apple's view on the further development of the audio side of AVFoundation quite nicely there, @GeneDeLisaDev! :blush:
