Audiobus: Use your music apps together.

What is Audiobus? Audiobus is an award-winning music app for iPhone and iPad which lets you use your other music apps together. Chain effects on your favourite synth, run the output of apps or Audio Units into an app like GarageBand or Loopy, or select a different audio interface output for each app. Route MIDI between apps — drive a synth from a MIDI sequencer, or add an arpeggiator to your MIDI keyboard — or sync with your external MIDI gear. And control your entire setup from a MIDI controller.


SynthJacker

Comments

  • @fattigman said:
    what is it you have planned for Korg users?

    Maybe not specifically for Korg users, but Inter-App Audio in general. Or a better way to put it would be ”everything that is not AU or external synth”.

    The general idea is to use the SynthJacker slicing engine to slice an audio file that was recorded outside of the app. You could generate a MIDI file based on the SJ sequence settings, then play it back with your IAA app and record it, then bring it to SJ for slicing and other post-processing.
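The MIDI-file idea described above is straightforward to sketch. Below is an illustrative Python script (not SynthJacker's actual code) that writes a minimal type-0 Standard MIDI File playing one note at a time with a gap of silence between notes, which is essentially what an autosampling sequence needs; the note list, velocity, and tick values are arbitrary assumptions.

```python
import struct

def autosample_smf(notes, velocity=100, ticks_per_beat=96,
                   note_ticks=96, gap_ticks=96):
    """Return the bytes of a type-0 Standard MIDI File that plays each
    MIDI note number in `notes` in turn, with a gap of silence between
    notes so a slicer can find the boundaries later."""
    def vlq(n):
        # Variable-length-quantity encoding used for SMF delta times.
        out = [n & 0x7F]
        n >>= 7
        while n:
            out.append((n & 0x7F) | 0x80)
            n >>= 7
        return bytes(reversed(out))

    track = bytearray()
    delta = 0
    for n in notes:
        track += vlq(delta) + bytes([0x90, n, velocity])  # note on, channel 1
        track += vlq(note_ticks) + bytes([0x80, n, 0])    # note off
        delta = gap_ticks  # silence before the next note's note-on
    track += vlq(delta) + bytes([0xFF, 0x2F, 0x00])       # end-of-track meta

    header = b"MThd" + struct.pack(">IHHH", 6, 0, 1, ticks_per_beat)
    return header + b"MTrk" + struct.pack(">I", len(track)) + bytes(track)

# Example: C4, E4, G4, one beat each with a one-beat gap.
with open("autosample.mid", "wb") as f:
    f.write(autosample_smf([60, 64, 67]))
```

The resulting file can be played back by any app or MIDI player, which is what makes this workflow so general.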

  • @dendy yes, that would work. Thank you.
    @coniferprod excellent idea. Sounds like an ”easy” and general solution. I know developers hate to hear when people say something is easy :wink:
    I will recommend SJ to the deluge community as it is becoming an essential tool for me.

  • Thanks for giving the Thor and the Thor people some love! Really appreciate it!!!

  • @coniferprod said:

    @fattigman said:
    what is it you have planned for Korg users?

    Maybe not specifically for Korg users, but Inter-App Audio in general. Or a better way to put it would be ”everything that is not AU or external synth”.

    The general idea is to use the SynthJacker slicing engine to slice an audio file that was recorded outside of the app. You could generate a MIDI file based on the SJ sequence settings, then play it back with your IAA app and record it, then bring it to SJ for slicing and other post-processing.

    When you think about it, this is all you need to sample IAA apps. Most users have AUM, which can host IAA apps and record the output. So in my opinion there's no need for Audiobus integration, since the solution is so general that it will work with any app that can receive/play MIDI and record audio.

  • @fattigman said:
    When you think about it, this is all you need to sample IAA apps. Most users have AUM, which can host IAA apps and record the output. So in my opinion there's no need for Audiobus integration, since the solution is so general that it will work with any app that can receive/play MIDI and record audio.

    Exactly – and that is why I don’t really see the need for full-on Audiobus support.

  • @coniferprod : I think supporting audio input via AB (which is all we are talking about) is way more straightforward than you think. It would also have freed you from needing to host AUs yourself.

    I am not sure that more people have AUM than Audiobus, btw.

  • Off topic: there was a recent poll where AUM dominated. That's to be expected, since AB3 was introduced much later, and its first versions were miles behind.
    Back on the subject: we don't know whether creating an AU host is easier than creating an AU plugin. That's for the dev to decide.

    wim
    edited July 2019

    A couple of thoughts

    • I believe (not as an expert obviously!) that IAA, and therefore AUM compatibility, comes as a side benefit of implementing Audiobus compatibility. So it’s not a question of either/or unless a developer decides to go IAA only.

      • Audiobus functioned before IAA was introduced, and will continue to function after IAA no longer does (which I firmly believe won’t be for quite some time). Meanwhile, host music apps will continue to need to work together.
      • Writing to a file and chopping it up is totally fine! That’s the best and simplest answer IMO. No need for AudioBus, IAA, or even internal AU hosting. All that is needed is to be able to get audio of any kind into SynthJacker. Realtime isn’t important in the least.
      • SynthJacker is Great!
  • Here's a real benefit to being able to sample inside AudioBus: I could load an IAA or AUv3 app and insert FX (after a synth, to get EQ for example) and then pipe that signal into SynthJacker. With AUv3 loading I have to accept the output of the AUv3 without any additional controls... some of the AUv3s, by the way, don't even allow me much control of the app itself. AudioBus was, is, and will be the best tool for piping audio and MIDI together on iOS. There are 100+ developers that got the message and coded to AudioBus as a de facto standard. AudioBus is typically required to get a complex setup to start/stop from a single transport button.

    So... not the future? No. Trusting Apple to solve all problems is tricky. The fact that @coniferprod got AUv3 hosting to work is a testament to his skills. It's not a trivial exercise given Apple's documentation and what the standard demands. He had to code a host to get us this capability, and it's worth every penny. But adding AudioBus support as an FX app (meaning we feed it audio to be processed) would instantly double, if not triple, its value to me.

    But I do still want the longer sample time too. I just wanted to chime in that AudioBus is the essential standard most great apps support. The rush to AUv3-or-nothing has been oversold here. There are apps that can only be glued together with AB3, like @LuisMartinez's Drummer apps and iBassist, for transport controls. SynthJacker could help us make loops from these apps, but only with AB3 support to feed the MIDI and record the signal with some acceptable recording time. SynthJacker now offers the option to just save the whole recording, so multi-minute recordings of anything could be made by just triggering drums and saving the recording. That could be useful too.

    Anyway, I love what it does for AudioLayer and NS2 sample sets, and the SFZ output also works for import into Auria Pro's sampler. It's really great, and dumping enhancement requests is what we do here. We are a swamp of "free advice". Take it for what it's worth.

    wim
    edited July 2019

    @McD said:
    Here's a real benefit to being able to sample inside AudioBus: I could load an IAA or AUv3 app and insert FX (after a synth, to get EQ for example) and then pipe that signal into SynthJacker. With AUv3 loading I have to accept the output of the AUv3 without any additional controls... some of the AUv3s, by the way, don't even allow me much control of the app itself. AudioBus was, is, and will be the best tool for piping audio and MIDI together on iOS. There are 100+ developers that got the message and coded to AudioBus as a de facto standard. AudioBus is typically required to get a complex setup to start/stop from a single transport button.

    But you forget that AudioBus is an app. There's no reason the MIDI output from SynthJacker couldn't be sent to Audiobus Virtual MIDI, and the resulting output recorded with AUM or AudioShare in the output slot. Or... simpler... just sent to AUM the same way you would send it to any other app.

    I agree that Audiobus capability is a good thing to have, but I don't see a compelling need for it in a one-time capturing solution like SynthJacker, which requires no live performance capability.

  • @espiegel123 said:
    I think supporting audio input via AB (which is all we are talking about) is way more straightforward than you think. It would also have freed you from needing to host AUs yourself.

    Audio input is not enough, because SynthJacker needs to be able to drive the app it is sampling, with MIDI output to synth. That amounts to hosting, doesn’t it? That was where I hit a wall with Audiobus on the first attempt.

    Since there probably won’t be too many new IAA apps, I think it’s better for me, with limited resources, to concentrate on the AU support. It also opens the path to adding an effects chain to the instrument AU. That hasn’t been asked too much, but it’s something that I would like to get opinions about.

  • @coniferprod said:

    @espiegel123 said:
    I think supporting audio input via AB (which is all we are talking about) is way more straightforward than you think. It would also have freed you from needing to host AUs yourself.

    Audio input is not enough, because SynthJacker needs to be able to drive the app it is sampling, with MIDI output to synth. That amounts to hosting, doesn’t it? That was where I hit a wall with Audiobus on the first attempt.

    Since there probably won’t be too many new IAA apps, I think it’s better for me, with limited resources, to concentrate on the AU support. It also opens the path to adding an effects chain to the instrument AU. That hasn’t been asked too much, but it’s something that I would like to get opinions about.

    But re: MIDI, your app already sends MIDI to external devices, right? You can just send MIDI to Audiobus' MIDI port. No need for you to figure out hosting plugins if you support AB3 audio input, as that would enable capable AU hosts like AUM, AB3, and maybe apeMatrix to send their audio chains to SynthJacker, since they can send their output to Audiobus outputs.

    My understanding from other devs is that that is a lot less effort than creating a robust AU host that can handle effects chains. Some apps that have tried to become AU hosts have really struggled to get it right.

  • @coniferprod said:

    @wim said:
    What is being requested is for the ability for SynthJacker to be hosted within Audiobus so that SynthJacker can accept audio input from Audiobus.

    I may have thought about it the wrong way around or something; the way I envisioned it, I need SJ to drive the IAA instrument with MIDI and capture its output. I guess I could check it again and rethink.

    I have read the Audiobus SDK docs, it just did not seem to click with that use case, but I could be wrong!

    Obviously AUv3 is the Apple-sanctioned way for instruments and effects, and while I have high respect for Audiobus, it's far more likely than AU to take a hit if Apple changes something. The whole IAA deprecation thing hardly gives the impression that Apple cares about long-term stability. Even AUv3 is only, what, 3-4 years old (yes, there were v2 and v1, but no app extensions), and it's only now starting to take off.

    I took a long technical look at Audiobus last spring, even up to integrating the SDK into SynthJacker, but maybe I just didn’t get it.

    I really appreciate that you took the time to write about this. I hope I can rethink it and see what Audiobus support would really require. That will take some time, of course.

    I think I can see the confusion, although I'm not a developer either. For it to work as it currently does, it would need to be an IAA host; but it might work if it was implemented with Audiobus compatibility as both a MIDI sender and an audio receiver. You'd need multiple ports: one to drive the synth, and another, the Audiobus receiver (which goes in the output slot), to receive the audio.
    I'm guessing, but that might be easier than making it a simple IAA host :)

  • @wim said:
    But you forget that AudioBus is an app. There's no reason the MIDI output from SynthJacker couldn't be sent to Audiobus Virtual MIDI, and the resulting output recorded with AUM or AudioShare in the output slot. Or... simpler... just sent to AUM the same way you would send it to any other app.

    This is true. But this just uses SynthJacker to configure a MIDI sequence.

    On the back end it slices the recording, normalizes the resulting WAV, AIFF, or CAF samples, places them in folders, and uses naming to simplify loading into AudioLayer and NS2, and creates an SFZ file for loading those sample sets into samplers that accept that format.

    So having it trigger MIDI notes AND be the recording target is why AudioBus support would allow us to use it with more flexibility on the input side, and use its code to slice, process, and label the results. You just configure a run, and 10 minutes later a huge piano instrument is done and ready to load up and play. Sometimes some hand tweaking is required, but every update has solved the issues that required tweaking, and the results are getting better and better. It does (in my experience) become unstable for runs exceeding 13 minutes, but when we give @coniferprod the setup that creates the issues, he seems to add a fix in another update. Activity from users seems to have slowed, so he could be looking at new apps to create some income. But I think there are updates that would spur more purchases, with more discussion here.
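Since the SFZ output comes up here: for readers unfamiliar with the format, the generation step is plain text emission. A hypothetical Python sketch (not SynthJacker's actual naming or layout), mapping one autosampled file per root note into contiguous key ranges:

```python
def write_sfz(samples, path="instrument.sfz"):
    """samples: list of (filename, root_midi_note) pairs.
    Each region spans from just above the previous root up to its own
    root note, a common layout for autosampled instruments (sketch only,
    not SynthJacker's actual output)."""
    lines = ["<control>", "default_path=samples/", ""]
    lo = 0
    for filename, root in sorted(samples, key=lambda s: s[1]):
        lines += [
            "<region>",
            f"sample={filename}",
            f"lokey={lo} hikey={root} pitch_keycenter={root}",
            "",
        ]
        lo = root + 1
    text = "\n".join(lines)
    with open(path, "w") as f:
        f.write(text)
    return text
```

A sampler that reads SFZ then stretches each recorded note across its key range, which is why sampling at regular intervals (every few semitones) is usually enough.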

    wim
    edited July 2019

    @McD said:

    @wim said:
    But, you forget that AudioBus is an app. There's no reason the MIDI input from SynthJacker couldn't be sent to Audiobus Virtual Midi, and the resulting output recorded with AUM or AudioShare in the output slot. Or ... simpler ... just sent to AUM in the same way you would send it to any other app.

    This is true. But this just uses SynthJacker to configure a MIDI sequence.

    On the back end it slices the recording, normalizes the resulting WAV, AIFF, or CAF samples, places them in folders, and uses naming to simplify loading into AudioLayer and NS2, and creates an SFZ file for loading those sample sets into samplers that accept that format.

    So having it trigger MIDI notes AND be the recording target is why AudioBus support would allow us to use it with more flexibility on the input side, and use its code to slice, process, and label the results. You just configure a run, and 10 minutes later a huge piano instrument is done and ready to load up and play. Sometimes some hand tweaking is required, but every update has solved the issues that required tweaking, and the results are getting better and better. It does (in my experience) become unstable for runs exceeding 13 minutes, but when we give @coniferprod the setup that creates the issues, he seems to add a fix in another update. Activity from users seems to have slowed, so he could be looking at new apps to create some income. But I think there are updates that would spur more purchases, with more discussion here.

    I don’t think you understand yet. The developer has said that he’s working on making it so that the app can send midi to apps, then import a file recorded by that app for the slicing and dicing part. So, SynthJacker could send MIDI to Audiobus, AUM, Gadget, or whatever, and then import the resulting wave file (recorded by whatever means) to complete the process.

    If I’ve understood that correctly, then I honestly can’t see what else is needed. Send the midi to AUM and record a stem at whatever point in the signal chain you like. Import it and your processed whatever can now be an instrument.

  • I am looking into purchasing SynthJacker for doing just one specific thing.

    Sampling just one sound.

    I am a bit stupid with technology so please be kind. I had written a song with a sequence on a Korg M1 many years ago. I used a specific Combi Sound "MidiStack1" The M1 died a slow death. I was looking at buying another. Then the iM1 app came out and it sounds fantastic! I got my old sounds back, but no sequencer (I hated that old M1 sequencer!)

    I got NanoStudio2, great sequencer, great sounds! I cannot use iM1 in Nano Studio 2 :( I need that exact midistack1 sound for that one song as a sequence.

    My Question is:

    How do I use SynthJacker to faithfully sample the iM1 and then use that sound in NanoStudio 2?

  • @ralis said:
    How do I use SynthJacker to faithfully sample the iM1 and then use that sound in NanoStudio 2?

    The iM1 uses Inter-App Audio, but SynthJacker does not support IAA directly. But, if you have some audio hardware, it should be possible. This thread explains it better than I can:

    https://forum.audiob.us/discussion/32160/using-synthjacker-to-record-any-app-on-your-ipad-or-phone

    If you have specific questions about the method, it might be good to ask them in that thread, so that you will get more information in the right context.

    Also, to create samples that can be easily imported into Obsidian in Nanostudio 2 you will first need to set a couple of sample naming options in SynthJacker.
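For readers curious what such naming options involve: samplers typically parse the root note out of the filename, and the main gotcha is where "middle C" sits, since apps disagree on whether MIDI note 60 is "C3" or "C4". A hypothetical sketch (not SynthJacker's or Obsidian's actual scheme):

```python
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def note_name(midi_note, middle_c_octave=4):
    """Turn a MIDI note number into a name like 'C4'. The octave of
    middle C (MIDI 60) is adjustable to match the target app's convention."""
    octave = midi_note // 12 - (5 - middle_c_octave)
    return f"{NOTE_NAMES[midi_note % 12]}{octave}"

def sample_filename(patch, midi_note, velocity):
    """Hypothetical naming scheme: '<patch>-<note>-<velocity>.wav'."""
    return f"{patch}-{note_name(midi_note)}-{velocity}.wav"

print(sample_filename("iM1-MidiStack1", 60, 127))  # iM1-MidiStack1-C4-127.wav
```

If the filenames follow the convention the target sampler expects, an imported folder of samples can be auto-mapped with no manual key assignment.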

  • @coniferprod said:

    @ralis said:
    How do I use SynthJacker to faithfully sample the iM1 and then use that sound in NanoStudio 2?

    The iM1 uses Inter-App Audio, but SynthJacker does not support IAA directly. But, if you have some audio hardware, it should be possible. This thread explains it better than I can:

    https://forum.audiob.us/discussion/32160/using-synthjacker-to-record-any-app-on-your-ipad-or-phone

    If you have specific questions about the method, it might be good to ask them in that thread, so that you will get more information in the right context.

    Also, to create samples that can be easily imported into Obsidian in Nanostudio 2 you will first need to set a couple of sample naming options in SynthJacker.

    Went there, that made no sense to me :) Thanks anyway

  • Over at the NS2 forum @dendy made a great tutorial on how to do it. Check it out.

    @ralis said:

    @coniferprod said:

    @ralis said:
    How do I use SynthJacker to faithfully sample the iM1 and then use that sound in NanoStudio 2?

    The iM1 uses Inter-App Audio, but SynthJacker does not support IAA directly. But, if you have some audio hardware, it should be possible. This thread explains it better than I can:

    https://forum.audiob.us/discussion/32160/using-synthjacker-to-record-any-app-on-your-ipad-or-phone

    If you have specific questions about the method, it might be good to ask them in that thread, so that you will get more information in the right context.

    Also, to create samples that can be easily imported into Obsidian in Nanostudio 2 you will first need to set a couple of sample naming options in SynthJacker.

    Went there, that made no sense to me :) Thanks anyway

  • @wim said:
    I don’t think you understand yet. The developer has said that he’s working on making it so that the app can send midi to apps, then import a file recorded by that app for the slicing and dicing part. So, SynthJacker could send MIDI to Audiobus, AUM, Gadget, or whatever, and then import the resulting wave file (recorded by whatever means) to complete the process.

    @coniferprod - is this summary accurate? I typically have to route MIDI out to my audio interface and back into the iPad to route MIDI to anything other than a loaded AUv3 app.
    Are you intending to allow the routing of MIDI to local apps?

    Then, are you going to allow importing files for slicing into what I think of as "note" files, which are created by detecting the silence between samples and then slicing at the noise threshold crossing?

    I've never seen this plan in our discussions, but it would open the door to a lot of new flexible use cases.
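For what it's worth, the silence-detection slicing described above can be sketched in a few lines. This is an illustrative Python version (the threshold and gap values are made-up defaults, not SynthJacker's):

```python
def slice_on_silence(samples, threshold=0.01, min_gap=4410):
    """Split a mono audio buffer (floats in [-1, 1]) into note regions.
    A region starts where |sample| first crosses `threshold` and ends when
    the signal stays below it for at least `min_gap` samples (sketch only)."""
    regions, start, quiet = [], None, 0
    for i, s in enumerate(samples):
        if abs(s) >= threshold:
            if start is None:
                start = i          # note onset: first threshold crossing
            quiet = 0
        elif start is not None:
            quiet += 1
            if quiet >= min_gap:   # enough silence: close the region
                regions.append((start, i - quiet + 1))
                start, quiet = None, 0
    if start is not None:          # flush a region still open at the end
        regions.append((start, len(samples)))
    return regions

def normalize(region):
    """Scale a slice so its peak hits full scale, as the post describes."""
    peak = max(abs(s) for s in region) or 1.0
    return [s / peak for s in region]
```

Each returned (start, end) pair would then be cut out, normalized, and written to its own named file.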

  • Not internal routing, but instead generating a MIDI file with SynthJacker, then playing it back with the app you want to sample (if it can’t play MIDI files, get a MIDI player and hook it up with AUM – details pending), then record the output. Then import the audio file to SynthJacker for slicing and post-processing.

    That scenario is the lowest-hanging fruit. Again, I have nothing against Audiobus, but to use it would mean going all in, and I also have other related scenarios in mind apart from IAA - for example, this method would allow easy autosampling of desktop VST/AU plugins, maybe using iCloud or Dropbox to transfer files.

  • @coniferprod said:
    Not internal routing, but instead generating a MIDI file with SynthJacker, then playing it back with the app you want to sample (if it can’t play MIDI files, get a MIDI player and hook it up with AUM – details pending), then record the output. Then import the audio file to SynthJacker for slicing and post-processing.

    That scenario is the lowest-hanging fruit. Again, I have nothing against Audiobus, but to use it would mean going all in, and I also have other related scenarios in mind apart from IAA - for example, this method would allow easy autosampling of desktop VST/AU plugins, maybe using iCloud or Dropbox to transfer files.

    I'd be happy as a clam with this option. Simple and effective. Not to mention you could reuse the midi file with all kinds of different setups.

  • @coniferprod please at least consider adding a virtual MIDI port to the app, so that Synthjacker can send MIDI directly to iOS synths without having to save out a MIDI file. That way we could select Synthjacker's virtual MIDI port as an input in AUM and then record the audio from there. It would save a lot of hassle.

  • Same here, great idea!

    So this means that SynthJacker will, in a sense, have a MIDI out port, right? Since it will be sending the MIDI file playback into AUM, and we just record the output files, which SynthJacker will in turn process. I really like this idea! Thanks so much!

    @wim said:

    @coniferprod said:
    Not internal routing, but instead generating a MIDI file with SynthJacker, then playing it back with the app you want to sample (if it can’t play MIDI files, get a MIDI player and hook it up with AUM – details pending), then record the output. Then import the audio file to SynthJacker for slicing and post-processing.

    That scenario is the lowest-hanging fruit. Again, I have nothing against Audiobus, but to use it would mean going all in, and I also have other related scenarios in mind apart from IAA - for example, this method would allow easy autosampling of desktop VST/AU plugins, maybe using iCloud or Dropbox to transfer files.

    I'd be happy as a clam with this option. Simple and effective. Not to mention you could reuse the midi file with all kinds of different setups.

  • @richardyot said:
    @coniferprod please at least consider adding a virtual MIDI port to the app, so that Synthjacker can send MIDI directly to iOS synths without having to save out a MIDI file. That way we could select Synthjacker's virtual MIDI port as an input in AUM and then record the audio from there. It would save a lot of hassle.

    +1

    McD
    edited July 2019

    ENHANCEMENT REQUEST LIST:

    1. Add virtual MIDI
    2. Add saving the setup for recall (needed when slicing audio produced externally)
    3. Add audio file import for slicing into naming and folder schemes required by NS2, AudioLayer, etc.

    The group mind has ID'ed an ideal workflow with the least effort using features 1-3. The heavier alternatives are:

    4. AudioBus support
    5. IAA app hosting
  • @McD said:
    The group mind has ID'ed an ideal workflow with the least effort using these features.

    1. AudioBus support
    2. IAA app hosting

    Least effort for users, not the developer.
    Option 5 isn’t necessary if option 4 is implemented.

  • @wim said:

    @McD said:
    The group mind has ID'ed an ideal workflow with the least effort using these features.

    My text was ambiguous (I have edited it). I intended to convey that 1-3 seem to be the "ideal workflow" for many.

    I expect adding AudioBus and IAA to be the heavy lifting for the developer.

    The key to getting any new features into SynthJacker is to show that a lot of new purchases will follow. It appears the developer was working on another product and might have been ready to let SynthJacker just trickle in new sales. But threads like these encourage updates, with new use cases supported.

    Several of us have been prodding him to keep the updates coming, and he's been delivering new features at a steady clip. If 1-3 are easy, he'll probably push them out. 4-5 are probably dead on arrival unless dozens (or maybe more) declare it would be a must-buy.

    I keep suggesting features and proposing a price increase or maybe IAP options to get some sales for the effort.

    On the subject of waiting for updates... I was really hoping @Virsyn would ship SF2 or SFZ loading, since I have got all the EXS24 instruments off my Mac and want more sample sets. SynthJacker is my fallback, and NS2 looks like a much better option to avoid the headaches of AUv3s in Cubasis or any other DAW.

  • New SynthJacker demo video: how to autosample AUv3 instrument Audio Units on iOS
