Use SonoBus to expose DAW tracks to AUM

SonoBus is an app in testing (https://forum.audiob.us/discussion/41804/sonobus) which lets you collaborate on songs with other people. Of more interest to me is the direct connection mode which basically lets you pipe audio between two local instances of the app regardless of where they are hosted. This means you can have an instance of it in one host and bus the audio to an instance in another host. I had hoped the ApeMatrix send/receive plugin did something like this but it only worked between AUs in a host sandbox.

For me this opens some interesting possibilities. For example, I was able to effectively control multiple instances of the great NS2 Obsidian synth in AUM by doing the following:

  1. Load NS2 as IAA in AUM
  2. Add an Obsidian track
  3. Load an instance of SonoBus as an AUFX on that track
  4. Have the track listen on the NS2 MIDI channel 1, remembering to select "Track receives MIDI: Always"
  5. Load SonoBus into an "INSERT/EFFECTS" slot in AUM
  6. Open the SonoBus inside NS2
  7. Click the "Connect" button inside SonoBus and go to Connect->Direct
  8. Note the port number at the bottom of the screen. It is the number after the colon in "Local Address: a.b.c.d : portnumber"
  9. Go to the instance of SonoBus in AUM and again click the "Connect" button inside SonoBus and go to Connect->Direct
  10. Type "127.0.0.1:portnumber", e.g. "127.0.0.1:55493", and then click the "Direct Connect" button. NOTE: I tried with my network-assigned IP (192.x.y.z) and it did not work, but the local loopback address did; maybe some firewall issue.
  11. If all goes well then the two instances should now be talking to each other. Any sounds that you make from inside NS2 on that Obsidian track should now be routed to the AUM lane containing SonoBus.
  12. You should now be able to control Obsidian by pointing the keyboard in AUM to channel 1 of NS2.
  13. Repeat these steps for each instance of Obsidian you want to control, changing the MIDI input channel for each new track.
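As an aside, the direct-connect steps above appear to amount to a plain point-to-point UDP link, which would explain why the loopback address works between two apps on the same device. A minimal Python sketch of the idea (an illustration only, not the actual SonoBus wire protocol):

```python
import socket

# Hypothetical sketch: SonoBus "direct connect" appears to be a plain
# point-to-point UDP link, which would explain why two instances on the
# same device can talk over 127.0.0.1. This is NOT the SonoBus wire
# protocol, just an illustration of the loopback idea in steps 8-10.

receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))       # OS picks a free port (step 8)
port = receiver.getsockname()[1]      # the number after the colon

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"audio-frame", ("127.0.0.1", port))  # step 10

data, addr = receiver.recvfrom(1024)
print(data)  # b'audio-frame'
```

Nothing leaves the device: the kernel delivers loopback packets directly, which is also why a firewall that blocks the 192.x.y.z interface does not interfere.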

By default I think the instances are bi-directional. You can turn off the sending side to save some resources in the AUM instances by clicking "Send Quality->disable sending".

The default send quality is 96 kbps, and you may want to change this to a higher number. The default can be changed in the main menu, or in the sending instance by clicking on "Send Quality".
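For a rough sense of scale (assuming the 96k figure is a 96 kbps compressed bitrate, which is my assumption), uncompressed 24-bit PCM at 48 kHz stereo needs about 24 times more bandwidth, which is trivial over loopback but significant over WiFi:

```python
# Back-of-envelope bandwidth check (my numbers, not SonoBus internals):
# the default compressed send quality (assumed 96 kbps) versus
# uncompressed 24-bit PCM at 48 kHz stereo.

sample_rate = 48_000   # Hz
bit_depth = 24         # bits per sample
channels = 2           # stereo

pcm_bps = sample_rate * bit_depth * channels   # bits per second
print(pcm_bps // 1000, "kbps")                 # 2304 kbps
print(pcm_bps / 96_000)                        # 24.0x the 96 kbps default
```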

Resource usage seemed pretty low and I was able to control 7 Obsidians and an iBassist in separate AUM lanes via Fugue Machine without any obvious crackles or syncing issues. This was using PCM 24-bit audio as well.

I did not notice any particular latency issues either when playing the synth via the AUM keyboard but then I am not particularly sensitive to such things :)

@sonosaurus this app looks very useful!


Comments

  • edited January 10

    This looks sweet! I've been following this closely, and it looks like the perfect tool for using several iDevices.
    It seems using the uncompressed PCM format is recommended, so that it doesn't add latency from compression-decompression. Loopback also adds latency, but what a great tool to jam with fellows in the same neighborhood!
    I ordered several ethernet adapters (as recommended on their site), so this could be the perfect replacement for studiomux! Will report back!

  • edited January 10

    I imagine there are a number of ways this could be a useful tool. I don't have GarageBand installed to test but I expect that you could use this to get live audio out of individual GB tracks or instruments as well.

  • I was looking for a way to use it in AUM - never thought to look for it as an effect ..! Bloody amazing - thanks a million...👍

  • edited January 10

    This is cool! Some audio dropouts for me over WiFi, but a nice way to play around with stuff from the laptop in AUM 👍👍👍 Thanks @MisplacedDevelopment, and thanks to @enkaytee for pointing it out to me after seeing it here 👍👍👍

  • @MisplacedDevelopment said:
    SonoBus is an app in testing (https://forum.audiob.us/discussion/41804/sonobus) which lets you collaborate on songs with other people. [...] @sonosaurus this app looks very useful!

    A couple of things: SonoBus is very cool, but if you are using wi-fi and want full-fidelity audio you are likely to experience a combination of dropouts and/or latency. With wi-fi, dropouts are not the fault of SonoBus... it is the nature of wi-fi and how network protocols work. It is very much YMMV depending on your router and the traffic on your local network.

  • Yes, for sure

  • edited January 26

    Just to confirm, I tested this technique in GarageBand and it worked. I was able to control the GB bass instrument using KB1 in AUM and capture the output in an AUM channel with no noticeable latency, all on the same device. The procedure was the same as before: add SonoBus as an effect to the instrument track and it captures the track output. This includes both the live instrument and the output of the track if you play it.

    The plug-in did crash when I tried adding another instrument. I'm guessing that GB may unload or disconnect things when it switches windows. It was not obvious, but the AUM receiver said the other side was muted. Shame there is no master-track effects slot in GB. I'll see if I can capture multiple tracks tomorrow...

  • This is awesome @MisplacedDevelopment. I connected Slate, where I had three sub-tracks driving it, exported that MIDI and re-imported it to LK in AUM. Works so great. Maybe a little "canny" sound, not sure. Could be the drums.
    Now over to the next track, an Obsidian for sure. Thanks for the clear description.

  • edited January 26

    Been using Sonobus for a couple of weeks now and it's really a fantastic app, opening a lot of possibilities for desktop and iOS. I'm generally using WiFi Ad-hoc network connections.

    Last week I've used it to stream 8 audio channels from Drambo to my desktop DAW using 4 Sonobus VST instances listening on different UDP ports. Works great. On Windows, I only had to duplicate the VST dll 3 times with different names.

    I also appreciate that it can use the Opus codec, a very high quality and low latency audio compression even at fairly low bitrates. Far better than mp3.

    @Pxlhg The canny sound must be something else 😉

  • @rs2000 said:
    Been using Sonobus for a couple of weeks now and it's really a fantastic app, opening a lot of possibilities for desktop and iOS. I'm generally using WiFi Ad-hoc network connections.

    Last week I've used it to stream 8 audio channels from Drambo to my desktop DAW using 4 Sonobus VST instances listening on different UDP ports. Works great. On Windows, I only had to duplicate the VST dll 3 times with different names.

    When Espiegel (I think it was) introduced this here on the forum, I tried to connect to Reason on my PC but, well, I'm not good with this stuff and, ergo: I failed. Would you mind doing a similar step-by-step like the one above for how you connected iPad <> Windows (DAW)? When you say you duplicated a .dll I get anxious... :)

    I also appreciate that it can use the Opus codec, a very high quality and low latency audio compression even at fairly low bitrates. Far better than mp3.

    @Pxlhg The canny sound must be something else 😉

    Ha ha, yeah well, there is room for echo between my ears. Jokes aside, I recorded the piece and it sounds like it does in NS2, but with a very slight delay I think. I'll check later when that aforementioned gap fills up again :)

  • I’m glad you guys are making such creative uses of SonoBus!

    @rs2000 I’m curious why you needed to duplicate the DLL of the plugin though... why was that necessary?

  • edited January 26

    @sonosaurus said:
    I’m glad you guys are making such creative uses of SonoBus!

    @rs2000 I’m curious why you needed to duplicate the DLL of the plugin though... why was that necessary?

    Because my DAW (Synapse Orion v8, 64-bit) only allowed a single instance of the dll: loading it into a second effect slot automatically removed it from the first, until I duplicated it.
    Is there something in the VST SDK that you can set to make it behave like this?

  • Just tried it with Zenbeats and AUM and it actually works! As you know, Zenbeats doesn't have Audiobus support or any means of integrating other apps. There's a very slight delay; you can hear ratcheted drums if both the original signal and SonoBus are on. Very impressive, this is super cool.
    This app-to-app-within-an-iPad scenario is obviously not what SonoBus was intended for. It feels like having a friend sitting next to you on a sofa and, instead of talking, sending him a voice message through a server in Russia! This iOS thingy is crazy sometimes.
    Amazing job, SonoBus!

  • Hi there! Been toying with an Ethernet connection but have been hitting a wall: I set up a switch but it added an awful lot of latency (like 60 ms)... WiFi brings a smaller delay, which is weird.
    But it's crazy!!!!

  • edited January 26

    @tahiche it has been a while since I used ZenBeats, so I would be keen to hear some ways this could be used there. I'm aware of the new synth and the drum sequencer, but I wonder, for example, if there are any audio effects you can add in ZenBeats which would be handy to have in AUM?

    If anyone else can think of other apps where SonoBus might be used like this then do mention them. This effectively opens up instruments and effects in other apps as "live" audio units, provided they can host SonoBus.

    I've done a bit more testing in GB and the following instruments seem to work with SonoBus (basically, all those I have tested so far):

    • Guitars
    • Drums (Acoustic + Beat Sequencer)
    • Bass
    • Alchemy

    I was at one point able to attach SonoBus to 3 or 4 different instruments in GB. The plugin does not seem too stable in GB though, and it would randomly be unloaded if you switched instruments. I stand to be corrected, but I think you can only live-play one instrument at a time in GB anyway, so you only really need one instance of SonoBus running at a time for that purpose.

    To get MIDI into GB I created a virtual->virtual MIDI port in AudioBus and routed MIDI to this port. If the instruments had focus then they would be controlled and if they had SonoBus in an effects slot then the audio would be routed out of GB.

    I was also able to use the GB audio effects by having some audio in AUM going to an IAA slot on the output and then choosing it as the input under "Audio Recorder". I don't know if GB offers anything in particular that other AU effects don't already cover (live pitch shift?), but this would be a way of using them in AUM if you don't already own enough reverbs.

    I didn't try the AMP, but it has the same input options as the "Audio Recorder" and an effects slot, so it should work the same way. The delay may make it difficult to use for live performance if you are sensitive to it.

    Playing each part of a drum MIDI file separately would let you record drum stems. I don't see much advantage over doing this directly in GB, other than that you can use the various MIDI filtering tools in the AB/AUM host to help mute the different drum MIDI parts for each take. You might also see sync issues when trying to combine the stems.

    There is a noticeable chorus effect when playing, as you will hear the output from GB as well as the slightly delayed output from SonoBus in your host. Switching off the option to play GB in the background might fix this, but I did not try that.
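    That chorus effect is comb filtering: the dry GB output summed with the delayed SonoBus copy cancels frequencies whose half-period equals the delay. A small sketch, using an assumed 10 ms delay (not a measured SonoBus figure):

```python
import math

# The chorus-like sound is classic comb filtering: summing a signal with
# a delayed copy of itself cancels frequencies whose half-period matches
# the delay. The 10 ms delay below is an assumed round-trip figure, not
# a measured SonoBus latency.

delay = 0.010  # seconds (assumed)

def comb_gain(freq_hz, delay_s):
    # Magnitude of 1 + e^{-j*2*pi*f*d}: dry + delayed copy at unity gain
    return 2 * abs(math.cos(math.pi * freq_hz * delay_s))

first_notch = 1 / (2 * delay)   # 50 Hz for a 10 ms delay
print(first_notch, comb_gain(first_notch, delay))
```

    With a 10 ms delay the notches land at 50 Hz, 150 Hz, 250 Hz and so on, which is why the doubled signal sounds hollow rather than simply louder.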

  • @jazzmess said:
    Hi there! Been toying with an Ethernet connection but have been hitting a wall: I set up a switch but it added an awful lot of latency (like 60 ms)... WiFi brings a smaller delay, which is weird.
    But it's crazy!!!!

    Try a direct (crossover) connection, most modern ethernet interfaces will auto crossover.

  • @jazzmess said:
    Hi there! Been toying with an Ethernet connection but have been hitting a wall: I set up a switch but it added an awful lot of latency (like 60 ms)... WiFi brings a smaller delay, which is weird.
    But it's crazy!!!!

    How is the ethernet connected to your iPad? I wonder if the driver is a problem. I was looking for an ethernet adapter for my iPad but ended up not buying one, because all the ones that I saw had comments about there not being a driver for iOS.

  • edited January 26

    @rs2000 said:

    @jazzmess said:
    Hi there! Been toying with an Ethernet connection but have been hitting a wall: I set up a switch but it added an awful lot of latency (like 60 ms)... WiFi brings a smaller delay, which is weird.
    But it's crazy!!!!

    Try a direct (crossover) connection, most modern ethernet interfaces will auto crossover.

    Thanks man, always helpful and relevant answers! The setup was an iPad mini, 2 Air 2020, 1 PC.

    I used a basic network switch connected to one port of my router, and used dongles (Lightning for the iPad mini, a Vava hub for the 2020 Air); maybe the dongles introduced latency...
    I returned all the dongles and the switch. I feel I have to try again: what would you recommend as the setup and the gear to use? I want to go the cheapest route, and I don't need internet access on those devices.
    Do you think it would work to set up a local LAN, without connecting it to my internet router, and just plug them all into the switch?

  • @sonosaurus : any tips about non-wifi ethernet connections for iOS devices? Any recommend dongles?

  • @jazzmess Sure, you can set up a local LAN. I don't have any USB Ethernet dongle here, but direct ad-hoc WiFi connections with my Windows and Mac machines work well enough.
    On Mac: WiFi > Create network.

    On Win: open a command shell (hit Start, type cmd, right-click on it and run as Administrator)

    Now enter this:
    netsh wlan set hostednetwork mode=allow ssid=networkname key=password

    Then, to start the network:
    netsh wlan start hostednetwork

    The network should now be visible to connect to on your iDevice under the name you've chosen as "networkname".

    A few things can go wrong. For example, your WiFi adapter might not support Ad-hoc mode. In that case, just use your default WiFi router. Latency will be higher and you'll get more jitter but that's life.

  • edited January 27

    Two questions regarding localhost network operation:

    1. Did you figure out a way to compensate the latency?

    2. Do AUM and Zenbeats restore the Sonobus configuration via AUv3 state saving? Do they automatically reconnect after startup?

    Just my 2ct for the local use case... instead of fiddling around with Sonobus, which is focused on jamming with networked computers, wouldn't it be much better to create a set of companion AUv3 plugins that actually connect any AUv3 host with AudioBus? IMHO this would rule out all the latency of the iOS networking stack: one AUv3 audio FX that is an AB source, and another that is an AB destination. I think that would allow automatic latency compensation. It's crazy that this does not exist.

  • @krassmann For me the latency is more than acceptable for the facility it offers, especially for a product still in testing. There are however options inside SonoBus to tune some of the jitter and latency.

    Not sure about the state saving; I will need to look at this later, but I suspect some manual intervention will be required to reconnect the ends of the link, since the primary use of the app is to connect people across the net.

    I agree that if it were technically possible then having an AU that was just for bussing audio internally over shared IPC would be a good thing.

    This (mis)use of SonoBus is not what it was primarily designed for and so it may be that down the line there are optimisations that can be made for direct connections if the use case becomes popular.

  • @tahiche said:
    Just tried it with Zenbeats and AUM and it actually works! As you know, Zenbeats doesn't have Audiobus support or any means of integrating other apps. There's a very slight delay; you can hear ratcheted drums if both the original signal and SonoBus are on. Very impressive, this is super cool.
    This app-to-app-within-an-iPad scenario is obviously not what SonoBus was intended for. It feels like having a friend sitting next to you on a sofa and, instead of talking, sending him a voice message through a server in Russia! This iOS thingy is crazy sometimes.
    Amazing job, SonoBus!

    Hey! I'm impressed by what you did. Can you explain the routing procedure? I want to route audio from AUM into ZenBeats, where I want to record what is coming from AUM. Is it possible?

  • @bot0vod said:

    @tahiche said:
    Just tried it with Zenbeats and AUM and it actually works! As you know, Zenbeats doesn't have Audiobus support or any means of integrating other apps. There's a very slight delay; you can hear ratcheted drums if both the original signal and SonoBus are on. Very impressive, this is super cool.
    This app-to-app-within-an-iPad scenario is obviously not what SonoBus was intended for. It feels like having a friend sitting next to you on a sofa and, instead of talking, sending him a voice message through a server in Russia! This iOS thingy is crazy sometimes.
    Amazing job, SonoBus!

    Hey! I'm impressed by what you did. Can you explain the routing procedure? I want to route audio from AUM into ZenBeats, where I want to record what is coming from AUM. Is it possible?

    Not impressive at all! I just followed the method outlined by @MisplacedDevelopment: one instance in AUM and one in Zenbeats. You look at the port number and enter it in the connection tab. It was a quick test to see if it worked. If there's audio in the receiving SonoBus instance, you should be able to record it.
    It's crazy that you have to send the audio out of the iPad and back in to be able to route audio between apps on the same device.
    @MisplacedDevelopment I use Zenbeats to record audio in clip mode. Guitars, bass, etc. The use case would be: you get something going in AUM, you record some audio in Zenbeats (no need to engage SonoBus at this point, since Zenbeats and AUM can sync via Ableton Link), and after that you can route the AUM channels to Zenbeats via SonoBus to keep working on the song. Or the other way around: route the Zenbeats audio clips back to AUM to put them in a sampler or something...

    Now that we're here... is there an AUv3 plugin to fix latency in tracks? To fix the lag post-recording. Something like Logic's parameter where you can nudge the audio by ms... It'd have to nudge the audio forward, which might be impossible.

    Cheers!
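    On the nudge question above: conceptually, fixing recorded lag is just trimming the latency off the front of the clip. A toy sketch of the idea (hypothetical code, not an existing plugin):

```python
# Hypothetical sketch of post-recording latency compensation (no such
# AUv3 plugin is implied to exist): "nudging forward" just means dropping
# the first few milliseconds of samples, since the recorded audio arrived
# late by the round-trip latency, then padding the end with silence.

def nudge_forward(samples, latency_ms, sample_rate):
    n = int(round(latency_ms / 1000 * sample_rate))
    return samples[n:] + [0.0] * n

# Toy example: 5 samples at a 1 kHz rate, recorded 3 ms late.
track = [0.0, 0.0, 0.0, 1.0, 0.5]
print(nudge_forward(track, latency_ms=3, sample_rate=1000))
# [1.0, 0.5, 0.0, 0.0, 0.0]
```

    This only works offline (post-recording); a live plugin cannot shift audio earlier in time, which is presumably why the question notes it "might be impossible".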

  • No time to test at the moment but my gut feeling is that routing multiple SonoBus instances to a main SonoBus instance would give you the best chance of getting things in sync as there are then levers you can pull to help sync connected clients up. It may even be the case that the “server” side of the link does this for you automatically as it needs to handle much higher latencies when used across the net.

    For example, two AUM/AB channels going into SB and your ZenBeats audio going into an SB. These all connect to an instance of SB on its own AUM/AB channel. You can then adjust the latency to suit and then either record the mix in the host or even route it back into a separate ZB audio track.

  • @espiegel123 said:

    @jazzmess said:
    Hi there! Been toying with an Ethernet connection but have been hitting a wall: I set up a switch but it added an awful lot of latency (like 60 ms)... WiFi brings a smaller delay, which is weird.
    But it's crazy!!!!

    How is the ethernet connected to your iPad? I wonder if the driver is a problem. I was looking for an ethernet adapter for my iPad but ended up not buying one, because all the ones that I saw had comments about there not being a driver for iOS.

    Sorry I didn't answer your message: I bought a USB hub with an Ethernet port (making sure it works with iPads; some don't). Will be trying some more :-)

  • New beta features added:

  • Is SonoBus still in beta?

  • @raimundoarriagada yes, it is an open beta though

  • @espiegel123 just saw this post... I've been using a Digitech USB 2.0 hub: 3 USB ports and an Ethernet port. No driver needed, it just works. Bought it from a local electronics shop for AUS$25 and it's been working a treat.
    I think you can get a gigabit version too:
    https://www.jaycar.com.au/usb-2-0-to-ethernet-adaptor-with-3-port-usb-hub/p/YN8407?pos=11&queryId=39f45d49552e54f0ab4639bf637e6e6e&sort=relevance

    The manufacturer is Digitech.
