NanoStudio 2 update

Comments

  • No Audiobus support means I won’t buy it. No use for it then...
    Really too bad, as this was gonna be my Xmas present 😞 🌲

  • I’m more curious than excited. I already have so many ways of working, from the many DAWs and similar apps on iOS - the problem is that none of them quite covers all the bases I would like. I will no doubt buy NS2, but likely it will be another ‘nearly there’ to add to the collection.

    Don’t get me wrong, I’m glad some are very excited for NS2 :)

  • @lovadamusic said:

    @anickt said:

    @lovadamusic said:

    @kinkujin said:
    I think this thread, and the next one about the next long-awaited app (e.g. Drambo?), are just the thing we do. We need this 'thing' to represent our dreams and aspirations. I mean, this is how I get through the day. Life can be very boring and redundant and boring ... so we come here and get all amped up about the new something because it is something we can actually get excited about.

    So we passionate music-types get all invested in this non-tangible thing until it's released. Then when it lands we just pummel it because it didn't live up to the hopes we had ascribed to it - realistic or not. I'm guilty of this myself, more than once.

    In the case of NanoStudio, the dev seems to be a perfectionist and doesn't seem to make promises he can't keep. But, I'm sure I'll (we'll) find something to be disappointed about in it. I'd hate to be a developer. So much psychology.

    At the risk of being a downer, if anyone today, with all the affordable music-making capability we have, is sitting around bored waiting for the new thing that's going to make their dreams come true, it's never going to happen. Look for a new dream or a new hobby. On the contrary, passionate music-types should be wishing they had more hours in the day to take advantage of what they already have.

    I doubt if (for most people on this forum) anyone is “sitting around bored waiting for the new thing...”. For me anyway, it’s a matter of efficiency and getting the most done in the hours (or minutes) of the day that are available for music making. There are apps that contribute to that goal and those that don’t. Based on experience with NS1, NS2 will be one of those apps.

    "We need this 'thing' to represent our dreams and aspirations. I mean, this is how I get through the day. Life can be very boring and redundant and boring ... so we come here and get all amped up about the new something because it is something we can actually get excited about."

    If the shoe don't fit... ?

    Musicians I've known have always made music with whatever they can get their hands on, and it's never been boring. Seems today, for some "passionate music types," it's more about the technology.

    btw I’m not the least bit bored with the current state of music production. I use these new apps to motivate me to get through what can be at times a dull, dreary routine.

  • I am mostly excited about the synth he packed into it. I've heard some amazing samples from it which are spot on for my taste. So I am pretty happy to finally be seeing some videos, trailers and more sounds soon....

  • Let's hope Brice makes a preset bank for it as well :)

  • @Cib said:
    Let's hope Brice makes a preset bank for it as well :)

    Hear, hear!

  • @JohnnyGoodyear said:

    @Cib said:
    Let's hope Brice makes a preset bank for it as well :)

    Hear, hear!

    He makes the kind of presets I would like to make myself but have no time for, or just fail to do as well.
    But I would be into a preset bank with custom-made samples as well.
    I'm pretty sure Obsidian will be my favorite iOS synth ever. I already know this from what Eden can do and what will be added. I guess it won't be as good at monster analog sounds as a Model D, but here you can import multi-samples to add that flavor as well.
    It's the only app right now for which I would buy an iPad just to use this one app.

  • @Kühl said:
    No Audiobus support means I won’t buy it. No use for it then...
    Really too bad, as this was gonna be my Xmas present 😞 🌲

    How can you have no use for it? You make the whole track within the one app.

  • Re: IAA & AB support
    NS2 does not have support for either of these. Based on what Matt has posted and a bit of history, I am surmising the following:

    Matt was burned by changes in iOS that added a lot of extra NS1 app maintenance, which of course he isn't paid for. He was really burned when iOS 9 (I believe) broke AB functionality for NS1. On the NS1 forum we had a huge Wish List thread of features we wanted, and iOS was changing from 32-bit to 64-bit, so he basically had to start all over from scratch and build a new 64-bit app. We all know Apple creates some pretty crap code, so IAA creates headaches right off the bat, and based on past experience, changes in iOS will force changes to AB code as well. For a HUGE app like NS2 (which is coded by just one person), one needs to simplify where one can.

    NS1 was conceived as an all-in-the-box scratchpad for ideas that would later be moved to a PC/Mac DAW to be completed. The users liked NS1 too much and wanted Matt to make NS2 a more complete DAW. We don't need no stinkin' PC/Macs. Matt still prefers the all-in-the-box style of app because the user doesn't have to mess around with getting other apps to work. NS2 is focused on efficient workflow and doing everything within the app. He compromised with user requests to make NS1 AB compatible, but with the new app it looks like he was trying to avoid the extra headache and the time-consuming changes he would potentially have to make to maintain functionality with every stupid change Apple made to iOS.

    Following the PC/Mac model of VST/AU, Matt decided AUv3 would be the future of the iOS platform as well. Because he has no control over iOS, nor over the changes that other devs make to their apps, this makes sense business-wise to keep Blip Interactive viable. We may not like it, but as we grouse about the lack of features we should keep in mind that this is a guy who needs to feed his family. We should respect his decisions on choices like this. It also means that instead of time-consuming maintenance, he can focus on new features (iPhone support, audio tracks, etc.).

    As I see it, all the functionality of AB is still viable. Yes, there is an extra step or two to getting the audio into NS2, but I don't find that to be too difficult. I've been doing it with NS1 ever since AB compatibility was broken by iOS 9. People who are less patient with this may find another DAW preferable. If your opinions are so strong that you can't adjust your AB workflow, I understand and do not question your right to feel that way.

    For me the UI/UX aspects of other DAWs are more of a hindrance than the absence of this one feature from NS2. It's faster to use AB workarounds than to use other DAW apps. For me. I know this is a personal workflow issue and I'm not slamming anyone who feels differently. I'm just sharing my opinion about what works for me in the hope that it will help someone else.

    PS: I may be totally wrong about the reasons behind Matt's choices re: IAA/AB support. My recollections are fuzzy and I hope I haven't implied something erroneous by misremembering his posts on the NS1 forum.

    Note: Devs are wildly divergent in how they implement the AUv3 standard. This is still a bit of the Wild West, and AUv3 can't be expected to be perfect just yet. It will take some time for the standard to be implemented consistently across the majority of apps so that host apps have no problems. The more we as iOS musicians support this model, the better it will become. I hope. 😬

  • @Slam_Cut said:
    Re: IAA & AB support
    NS2 does not have support for either of these. Based on what Matt has posted and a bit of history, I am surmising the following:

    Matt was burned by changes in iOS that added a lot of extra NS1 app maintenance, which of course he isn't paid for. He was really burned when iOS 9 (I believe) broke AB functionality for NS1. On the NS1 forum we had a huge Wish List thread of features we wanted, and iOS was changing from 32-bit to 64-bit, so he basically had to start all over from scratch and build a new 64-bit app. We all know Apple creates some pretty crap code, so IAA creates headaches right off the bat, and based on past experience, changes in iOS will force changes to AB code as well. For a HUGE app like NS2 (which is coded by just one person), one needs to simplify where one can.

    NS1 was conceived as an all-in-the-box scratchpad for ideas that would later be moved to a PC/Mac DAW to be completed. The users liked NS1 too much and wanted Matt to make NS2 a more complete DAW. We don't need no stinkin' PC/Macs. Matt still prefers the all-in-the-box style of app because the user doesn't have to mess around with getting other apps to work. NS2 is focused on efficient workflow and doing everything within the app. He compromised with user requests to make NS1 AB compatible, but with the new app it looks like he was trying to avoid the extra headache and the time-consuming changes he would potentially have to make to maintain functionality with every stupid change Apple made to iOS.

    Following the PC/Mac model of VST/AU, Matt decided AUv3 would be the future of the iOS platform as well. Because he has no control over iOS, nor over the changes that other devs make to their apps, this makes sense business-wise to keep Blip Interactive viable. We may not like it, but as we grouse about the lack of features we should keep in mind that this is a guy who needs to feed his family. We should respect his decisions on choices like this. It also means that instead of time-consuming maintenance, he can focus on new features (iPhone support, audio tracks, etc.).

    As I see it, all the functionality of AB is still viable. Yes, there is an extra step or two to getting the audio into NS2, but I don't find that to be too difficult. I've been doing it with NS1 ever since AB compatibility was broken by iOS 9. People who are less patient with this may find another DAW preferable. If your opinions are so strong that you can't adjust your AB workflow, I understand and do not question your right to feel that way.

    For me the UI/UX aspects of other DAWs are more of a hindrance than the absence of this one feature from NS2. It's faster to use AB workarounds than to use other DAW apps. For me. I know this is a personal workflow issue and I'm not slamming anyone who feels differently. I'm just sharing my opinion about what works for me in the hope that it will help someone else.

    PS: I may be totally wrong about the reasons behind Matt's choices re: IAA/AB support. My recollections are fuzzy and I hope I haven't implied something erroneous by misremembering his posts on the NS1 forum.

    Note: Devs are wildly divergent in how they implement the AUv3 standard. This is still a bit of the Wild West, and AUv3 can't be expected to be perfect just yet. It will take some time for the standard to be implemented consistently across the majority of apps so that host apps have no problems. The more we as iOS musicians support this model, the better it will become. I hope. 😬

    Nice post. Didn't know about the lack of IAA in NS2.

    Anyway, I wanted to comment on the part regarding Apple's code and documentation. Right now, where AU is concerned, it's crap.

    They have a wonderful node-based framework called AVAudioEngine, which I use for Samplist. That engine is very capable of real-time audio. But it's a real bitch to get it into an AU.

    They're basically assuming every dev is an old dog accustomed to C and assembler code and to how CPUs work, which creates a kind of walled garden that makes it very hard for Swift developers to get into. I'm really pissed off and frustrated about this.

    I managed to make something work - playing a loop inside an AU - but apart from AUM it misbehaves badly in all the other hosts, including Apple's own GarageBand.

    Anyway, I just wanted to blow off some steam.
    Hats off to Matt from Blip. It's a huge accomplishment for one guy to build that monster of an app. Let's all think of that before we start our usual bitching like "it doesn't have this or it doesn't do that". I'm very pleased with how it turned out and, more importantly, I'm pleased it's finally here.
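
    A minimal sketch of the pattern described in this post - hosting an AVAudioEngine graph inside an AUv3 render cycle via realtime manual rendering - might look roughly like the Swift below. This is an illustration only, not Samplist's actual code; the EngineWrapper type and its method names are invented for the example.

        import AVFoundation

        // Hypothetical wrapper: run an AVAudioEngine graph inside an AUv3 by
        // switching the engine to realtime manual rendering and pulling frames
        // from its manualRenderingBlock on the host's render thread.
        final class EngineWrapper {
            let engine = AVAudioEngine()
            let player = AVAudioPlayerNode()
            private var renderBlock: AVAudioEngineManualRenderingBlock!

            func prepare(format: AVAudioFormat, maxFrames: AVAudioFrameCount) throws {
                engine.attach(player)
                engine.connect(player, to: engine.mainMixerNode, format: format)
                // Detach the engine from the hardware; audio is now pulled manually.
                try engine.enableManualRenderingMode(.realtime, format: format,
                                                     maximumFrameCount: maxFrames)
                renderBlock = engine.manualRenderingBlock
                try engine.start()
                player.play()
            }

            // Call this from the audio unit's internalRenderBlock on each render cycle.
            func render(frameCount: AVAudioFrameCount,
                        into bufferList: UnsafeMutablePointer<AudioBufferList>) -> OSStatus {
                var status: OSStatus = noErr
                _ = renderBlock(frameCount, bufferList, &status)
                return status
            }
        }

    Wiring this into an AUAudioUnit subclass, and scheduling buffers onto the player from outside the render thread, is where hosts start to disagree - which matches the behaviour described above.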

  • I think the IAA/AB support topic is quite simple.. no need for big emotions

    A/ Is your workflow constantly switching between trillions of apps and screens, do you enjoy managing all that inter-app routing, is it inspiring for you? Then NS2 is not the DAW for you. There are other possibilities, sufficiently advanced.

    B/ Are you searching for an app where you can do everything "in the box" at pro sound quality (like Reason was before it added extensions and VST, or like Gadget)? Then NS2 is the right choice.

    It's always just about priorities and personal needs..

    Looking forward to all the great tunes people will make with NS2! Can't wait, next year will be a huge ride!

  • This has just been posted over on the NS forum, which gives the whole IAA/AUv3 picture direct from the developer:

    I'll be brutal.

    IAA's an old technology which was never thought through properly in the first place. It doesn't support multiple instances and the workflow's awful from the user's point of view, aside from all its other technical issues.

    AUv3 is so obviously the way it should have been done in the first place. It has support for multiple instances, everything automatically loads with the host's project in the same way as you saved it, and it's easy to perform non-realtime mixdowns within the host.

    Desktop users have had this for years and it's obviously the right way to do it.

    Any plugin developer still holding onto IAA is going in the wrong direction - as I'm sure they themselves know. If they don't have an AU version then it's simply because they probably don't have the time or resources to update it. They wouldn't hold onto it for any other reason.

    Load up a host project, every plugin restored with the same settings as when it was saved, multiple instances, non-realtime mixdown, no messing around paging between different apps or loading them in a precise order to make sure it works.

    It's the future. No brainer.

    Obviously not going to make everyone happy :smile:
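
    In AUv3 terms, "every plugin restored with the same settings as when it was saved" comes down to the plugin's state dictionary (AUAudioUnit.fullState), which the host serialises into its project file and hands back on reload. A minimal host-side sketch in Swift, purely for illustration - the saveState/restoreState helper names are invented here:

        import Foundation
        import AudioToolbox

        // Sketch: persist and restore an AUv3 plugin's settings with its host project.
        // fullState is the real AUAudioUnit property; these helpers are hypothetical.
        func saveState(of unit: AUAudioUnit) throws -> Data {
            let state = unit.fullState ?? [:]
            return try PropertyListSerialization.data(fromPropertyList: state,
                                                      format: .binary,
                                                      options: 0)
        }

        func restoreState(_ data: Data, into unit: AUAudioUnit) throws {
            if let state = try PropertyListSerialization.propertyList(from: data,
                                                                      options: [],
                                                                      format: nil) as? [String: Any] {
                unit.fullState = state
            }
        }

    IAA has no equivalent host-managed state, which is a large part of the workflow gap the developer is describing.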

  • I agree with your post dendy, except that you left out option C: use both A & B. Like AB. Only different. 😊

  • @Cib said:
    It's the only app right now for which I would buy an iPad just to use this one app.

    I agree with you. I just ordered a new iPad Pro. I'm in deep now. What the hell. You only live once.

  • Btw, as @Slam_Cut mentioned, there is still a way to get the sound of any IAA-compatible app into NS.. and it's not that complicated..

    • run AudioShare and load the IAA synth into it
    • in NS2 there will be a "MIDI out instrument" - point this instrument at the IAA synth
    • play the notes in NS2 and record the audio in AudioShare
    • open the recorded audio from AudioShare in NS2 and put it onto a pad of the built-in sampler

    to me, this workflow is basically similar to the one I've always used for recording external hw synths, just with the difference that you record the 'external' synth within your device.. all the elements needed for this workflow (MIDI out instrument, import from AudioShare) were confirmed by Matt

  • @Slam_Cut Many thanks for your post!

    I can understand the stance towards IAA.

    But is the situation the same for Audiobus?
    Does no IAA support necessarily mean there can be no AB support?
    @Michael

    I want to save and restart sessions with AudioBus, and NS2 should be a part of that!

    As much as I anticipated NS2, I wanted to add it, not to restrict myself to it.

  • @kitejan said:
    This has just been posted over on the NS forum, which gives the whole IAA/AUv3 picture direct from the developer:

    I'll be brutal.

    IAA's an old technology which was never thought through properly in the first place. It doesn't support multiple instances and the workflow's awful from the user's point of view, aside from all its other technical issues.

    AUv3 is so obviously the way it should have been done in the first place. It has support for multiple instances, everything automatically loads with the host's project in the same way as you saved it, and it's easy to perform non-realtime mixdowns within the host.

    Desktop users have had this for years and it's obviously the right way to do it.

    Any plugin developer still holding onto IAA is going in the wrong direction - as I'm sure they themselves know. If they don't have an AU version then it's simply because they probably don't have the time or resources to update it. They wouldn't hold onto it for any other reason.

    Load up a host project, every plugin restored with the same settings as when it was saved, multiple instances, non-realtime mixdown, no messing around paging between different apps or loading them in a precise order to make sure it works.

    It's the future. No brainer.

    Obviously not going to make everyone happy :smile:

    somebody send the Propellerheads crew this message...

  • I do feel NS2 will comply with my two 'musts': AUv3 and Files.app support
    (i.e. access to the entire NS2 folder structure using Files.app).

    Most of the time I use audio tracks for sampling & re-sampling, so with good export options the lack of dedicated audio tracks is a non-issue for me. Export a track/pattern to audio and import it into the sampler.

    As long as the sampling/recording can be synced to the transport's start & stop, it will be quite easy to record the 'audio tracks' into the sampler and trigger them from the timeline.

    Even though this is the AB forum I seldom actually use AB as I see no real need for it.

  • There is one important point of view - if DAW developers were more open to DROPPING IAA/AB support, developers of synths/FX would be pushed a lot harder to ADD AUv3 support - which would be a big benefit to all of us.

    So from this point of view NS2 is the first nice example of forward thinking. I hope that as time goes on, other developers will follow this direction as well.

  • @Samu said:
    I do feel NS2 will comply with my two 'musts': AUv3 and Files.app support
    (i.e. access to the entire NS2 folder structure using Files.app).

    Most of the time I use audio tracks for sampling & re-sampling, so with good export options the lack of dedicated audio tracks is a non-issue for me. Export a track/pattern to audio and import it into the sampler.

    As long as the sampling/recording can be synced to the transport's start & stop, it will be quite easy to record the 'audio tracks' into the sampler and trigger them from the timeline.

    Even though this is the AB forum I seldom actually use AB as I see no real need for it.

    For me, having to export audio into something else and then import it back into the NS2 sampler is just too troublesome. I'll wait for audio tracks to be added next summer (2019).

  • I never had NS1, and now it's off the App Store as everyone knows, wahhh, but I'm stoked for NS2 cuz it's going to be brand new to me!

  • @[Deleted User] said:

    For me, having to export audio into something else and then import it back into the NS2 sampler is just too troublesome. I'll wait for audio tracks to be added next summer (2019).

    The way it works in Renoise (which lacks dedicated audio tracks) is that the 'sampler' can record in sync while the project is playing, meaning 'audio' can be recorded directly into the sampler; no separate export/import is needed.

    After the recording is done, a 'trigger note' can be added to the timeline, and the same audio snippet can also be used as an instrument if needed.

    Depending on how NS2 implements sampling this could work.
    But if there is a mandatory 'rename after recording' step in NS2 it will get super annoying...
    (A mandatory 'rename after recording' is something that kills my sampling flow completely).

  • @Samu have you considered the option that, with such a powerful but still extremely efficient built-in synth like Obsidian, you maybe won't need to resample at all? :-) Just a different point of view to think about :-)

  • @alecsbuga said:
    Nice post. Didn't know about the lack of IAA in NS2.

    Anyway, I wanted to comment on the part regarding Apple's code and documentation. Right now, where AU is concerned, it's crap.

    They have a wonderful node-based framework called AVAudioEngine, which I use for Samplist. That engine is very capable of real-time audio. But it's a real bitch to get it into an AU.

    They're basically assuming every dev is an old dog accustomed to C and assembler code and to how CPUs work, which creates a kind of walled garden that makes it very hard for Swift developers to get into. I'm really pissed off and frustrated about this.

    I managed to make something work - playing a loop inside an AU - but apart from AUM it misbehaves badly in all the other hosts, including Apple's own GarageBand.

    Anyway, I just wanted to blow off some steam.
    Hats off to Matt from Blip. It's a huge accomplishment for one guy to build that monster of an app. Let's all think of that before we start our usual bitching like "it doesn't have this or it doesn't do that". I'm very pleased with how it turned out and, more importantly, I'm pleased it's finally here.

    Very fair.

  • @Samu said:

    @[Deleted User] said:

    For me, having to export audio into something else and then import it back into the NS2 sampler is just too troublesome. I'll wait for audio tracks to be added next summer (2019).

    The way it works in Renoise (which lacks dedicated audio tracks) is that the 'sampler' can record in sync while the project is playing, meaning 'audio' can be recorded directly into the sampler; no separate export/import is needed.

    After the recording is done, a 'trigger note' can be added to the timeline, and the same audio snippet can also be used as an instrument if needed.

    Depending on how NS2 implements sampling this could work.
    But if there is a mandatory 'rename after recording' step in NS2 it will get super annoying...
    (A mandatory 'rename after recording' is something that kills my sampling flow completely).

    Didn't know Renoise worked that way, interesting. No indication so far that NS2 can do the same. It would be nice if someone could confirm?

  • @dendy said:
    @Samu have you considered the option that, with such a powerful but still extremely efficient built-in synth like Obsidian, you maybe won't need to resample at all? :-) Just a different point of view to think about :-)

    Yeah, that has crossed my mind and I will be giving it a deep dive for sure :)

    The first thing I will check is the LFOs and whether they offer any kind of custom waveforms that can be used as 'step sequencers' for modulating parameters...

  • @Samu said:

    @dendy said:
    @Samu have you considered the option that, with such a powerful but still extremely efficient built-in synth like Obsidian, you maybe won't need to resample at all? :-) Just a different point of view to think about :-)

    Yeah, that has crossed my mind and I will be giving it a deep dive for sure :)

    The first thing I will check is the LFOs and whether they offer any kind of custom waveforms that can be used as 'step sequencers' for modulating parameters...

    MSEGs would be great. I always wonder why more iOS synths don't have them. VirSyn comes to mind, but are there any more?
    Envelopes are where things can get really creative and complex, and here most iOS synths could do much more to get away from static sounds.
    But Mitosynth is the king of LFOs on iOS for me.
    I also wonder why there is nothing like MidiShaper (Cableguys) on iOS. Since you can use AUv3 MIDI and save/recall it as part of a synth, I just add missing features via third-party apps.

  • @alecsbuga
    Hats off to Matt from Blip. It's a huge accomplishment for one guy to build that monster of an app

    And consider that he is an "old dog accustomed to C and assembler code", because NS is written in C++ with the sensitive DSP parts of the code in asm ;-)

  • @Cib said:

    MSEGs would be great. I always wonder why more iOS synths don't have them. VirSyn comes to mind, but are there any more?

    The FENVs (Flexible Envelopes) in Quanta are nice. SquareSynth 2 has nice 'tables' that can be used to modulate some parameters. iSEM has four 8-step sequencers that can be routed to almost any parameter.
    BM3 has dedicated step-sequencers that can be added to most parameters on sample-based instruments.

    I'm still thinking that a custom LFO wave (with, for example, 4-64 steps), with optional interpolation between steps (for stepped and smooth modes) and output quantise (i.e. matching the table's y-axis step values to, for example, semitones), could cover a lot of ground, especially if the speed is kept high (for example 1-20 ms per step).

    Alchemy in Logic Pro X has very nice modulation options too, and it's kinda frustrating knowing that the 'engine' is already in there in iOS GarageBand but there's no way to do deep patch editing, arghhhhhhhh..... :D
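
    For what it's worth, the kind of custom step-table LFO described just above (a handful of steps, optional smoothing between them, optional quantising of the output to semitones) is easy to picture in code. A rough Swift sketch - the StepLFO type and its fields are invented for illustration, not any particular app's engine:

        import Foundation

        // Hypothetical step-table LFO: user-drawn steps, optional interpolation,
        // optional quantisation of the output to semitones (here -1...1 is mapped
        // onto +/-12 semitones as an assumption).
        struct StepLFO {
            var steps: [Float]          // values in -1...1, e.g. 4-64 entries
            var stepDuration: Double    // seconds per step, e.g. 0.001-0.020
            var smooth: Bool            // true = interpolate between steps
            var quantiseToSemitones: Bool

            /// Value of the LFO at a given time (in seconds), looping over the table.
            func value(at time: Double) -> Float {
                guard !steps.isEmpty else { return 0 }
                let position = (time / stepDuration)
                    .truncatingRemainder(dividingBy: Double(steps.count))
                let index = Int(position)
                var out: Float
                if smooth {
                    let next = (index + 1) % steps.count
                    let frac = Float(position - Double(index))
                    out = steps[index] * (1 - frac) + steps[next] * frac  // smooth mode
                } else {
                    out = steps[index]                                    // stepped mode
                }
                if quantiseToSemitones {
                    out = (out * 12).rounded() / 12   // snap to the nearest semitone
                }
                return out
            }
        }

        // Example: an 8-step ramp used as a fast "step sequencer" for pitch or filter.
        let lfo = StepLFO(steps: [-1, -0.75, -0.5, -0.25, 0, 0.25, 0.5, 0.75],
                          stepDuration: 0.01, smooth: false, quantiseToSemitones: true)
        let value = lfo.value(at: 0.036)   // lands on the table's fourth step

    Whether Obsidian's LFOs expose anything like this is exactly the kind of thing worth checking once the app is out.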

  • @Samu said:

    @Cib said:

    MSEGs would be great. I always wonder why more iOS synths don't have them. VirSyn comes to mind, but are there any more?

    The FENVs (Flexible Envelopes) in Quanta are nice. SquareSynth 2 has nice 'tables' that can be used to modulate some parameters. iSEM has four 8-step sequencers that can be routed to almost any parameter.
    BM3 has dedicated step-sequencers that can be added to most parameters on sample-based instruments.

    I'm still thinking that a custom LFO wave (with, for example, 4-64 steps), with optional interpolation between steps (for stepped and smooth modes) and output quantise (i.e. matching the table's y-axis step values to, for example, semitones), could cover a lot of ground, especially if the speed is kept high (for example 1-20 ms per step).

    Alchemy in Logic Pro X has very nice modulation options too, and it's kinda frustrating knowing that the 'engine' is already in there in iOS GarageBand but there's no way to do deep patch editing, arghhhhhhhh..... :D

    Yes, Alchemy is a mod monster. My favorite synth of all. Indeed we have the engine running on iOS, but sadly only with a player, no real editing.
    There are more flexible options in Omnisphere and Falcon, but I just love Alchemy the most.
    Add MPE to it as well :)

This discussion has been closed.