
VisualSwift Public Beta

124 Comments

  • Build 71:
    MIDI-TX-RX now works as a MIDI source and allows enabling/disabling of Tx/Rx.

  • edited January 2022

    @Jorge Been trying out the MIDI module beta today. Used it back into the VisualSwift playground from AUM with no problems, and also out from AUM into Cubasis 3 with no problems. I get instant crashing when loading the Rx/Tx module into Loopy Pro, though. I also tried an IAA MIDI sender app, TC Data, to send MIDI through the Rx/Tx module with both open in AUM and then externally through to Cubasis 3, but couldn't seem to get the connection to work. I was thinking this could come in handy for IAA MIDI-only apps when sending externally. Actually, I just realized that the MIDI module is not showing up in Loopy Pro; it's the audio-only Rx module. Maybe you could make the distinction between the two slightly clearer with a different symbol or something.

  • You code quicker than I cook!

    Audio Evolution Mobile works with Rx as a MIDI instrument routed through AUM and back in.

    GarageBand must have some clever things going on as it somehow is managing to block MIDI from the Rx app.

    NS2 does not work so well. Previewing individual notes works fine but playing back a clip does not. I noticed NS2 sends a lot of extra MIDI data including MIDI clock which I tried to filter out using StreamByter but that did not help.

    I will try Zenbeats later if I get time as that is another app which would benefit from better connectivity.

  • edited January 2022

    @MisplacedDevelopment said:
    You code quicker than I cook!

    Audio Evolution Mobile works with Rx as a MIDI instrument routed through AUM and back in.

    GarageBand must have some clever things going on as it somehow is managing to block MIDI from the Rx app.

    NS2 does not work so well. Previewing individual notes works fine but playing back a clip does not. I noticed NS2 sends a lot of extra MIDI data including MIDI clock which I tried to filter out using StreamByter but that did not help.

    I will try Zenbeats later if I get time as that is another app which would benefit from better connectivity.

    It makes sense that playing back a clip doesn't work well as I haven't yet transmitted the event's offset into the buffer to place them with sample precision. Basically at the moment all the notes that fall inside the same buffer play at the same time.

    I'm starting to think about having some kind of MIDI events monitor included with the MIDI-Tx-Rx GUI.
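The problem Jorge describes, all notes that fall inside one buffer firing at the same time until the offset is transmitted, can be sketched like this (Python with made-up names; this is just the arithmetic, not the actual plugin code):

```python
# Sketch (illustrative names, not VisualSwift's code) of why per-event
# buffer offsets matter: without them, every MIDI event that falls
# inside one render buffer collapses onto the buffer's first sample.

BUFFER_SIZE = 512  # frames per render cycle

def schedule(events, buffer_start, with_offsets):
    """Map absolute event times (in samples) to positions in one buffer."""
    placed = []
    for t, note in events:
        if buffer_start <= t < buffer_start + BUFFER_SIZE:
            offset = (t - buffer_start) if with_offsets else 0
            placed.append((offset, note))
    return placed

events = [(1024, 60), (1200, 64), (1500, 67)]  # (sample time, MIDI note)

# Without offsets, all three notes land at sample 0 of the buffer:
print(schedule(events, 1024, with_offsets=False))  # [(0, 60), (0, 64), (0, 67)]
# With offsets, each note keeps its place inside the buffer:
print(schedule(events, 1024, with_offsets=True))   # [(0, 60), (176, 64), (476, 67)]
```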

  • @Jorge said:

    @MisplacedDevelopment said:
    You code quicker than I cook!

    Audio Evolution Mobile works with Rx as a MIDI instrument routed through AUM and back in.

    GarageBand must have some clever things going on as it somehow is managing to block MIDI from the Rx app.

    NS2 does not work so well. Previewing individual notes works fine but playing back a clip does not. I noticed NS2 sends a lot of extra MIDI data including MIDI clock which I tried to filter out using StreamByter but that did not help.

    I will try Zenbeats later if I get time as that is another app which would benefit from better connectivity.

    It makes sense that playing back a clip doesn't work well as I haven't yet transmitted the event's offset into the buffer to place them with sample precision. Basically at the moment all the notes that fall inside the same buffer play at the same time.

    Understood. Just tried Drambo as well, and that worked first time (Drambo using Rx to send MIDI to AUM, which sends it to iM1 running as an IAA, back into Tx to Drambo).

  • @MisplacedDevelopment said:
    You code quicker than I cook!

    Audio Evolution Mobile works with Rx as a MIDI instrument routed through AUM and back in.

    GarageBand must have some clever things going on as it somehow is managing to block MIDI from the Rx app.

    NS2 does not work so well. Previewing individual notes works fine but playing back a clip does not. I noticed NS2 sends a lot of extra MIDI data including MIDI clock which I tried to filter out using StreamByter but that did not help.

    I will try Zenbeats later if I get time as that is another app which would benefit from better connectivity.

    Yes, I was just trying the MIDI beta into BM3 when the updated beta dropped and caused my instrument to hang. Ha ha, @Jorge is a fast coder! It was working OK prior to that, I should add.

    I’ve been using the StreamByter Nimble looper into the new MIDI module; it's great for doing looping into external apps using Tx/Rx.

  • Zenbeats also works with the Rx plugin. Realtime render lets you mix down with Rx, which is promising, though as discussed the MIDI is not expected to render as written at present. It is a shame you can’t record wet FX plugins on audio tracks in ZB, as the audio routes in nicely using the audio effect version of Rx.

  • @Jorge : the RX instrument is not working for me to receive audio sent via the transmit plugin from AUM to Loopy Pro in the latest beta. The RX effect works fine to receive the audio.

    This was working for me in yesterday's beta.

  • @espiegel123 said:
    @Jorge : the RX instrument is not working for me to receive audio sent via the transmit plugin from AUM to Loopy Pro in the latest beta. The RX effect works fine to receive the audio.

    This was working for me in yesterday's beta.

    Thanks; if the RX effect works and the RX instrument doesn't, it should be easy to troubleshoot by comparing the code between them.

  • edited January 2022

    Build 73:
    MIDI notes weren't playing with sample precision.
    The MIDI-Tx-Rx plugin now transmits buffer offset information so the receiver can place events with sample precision.
    New VisualSwift-Mon MIDI processor for debugging issues with MIDI-Tx-Rx.
    Stop the audio before connecting or disconnecting the new VisualSwift-Mon plugin; otherwise it might crash.

    VisualSwift-Mon shows five columns:
    1 - buffer size
    2 - event offset inside the buffer
    3, 4 and 5 - MIDI data

    The offsets should always be between 0 and the buffer size.
    At the moment there is no protection against invalid offsets, in order to help find issues.

    Note in the screenshot above that all the offsets before the transmission and after the reception match each other and are within the correct range.
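The range check the monitor relies on (an offset must fall inside its buffer) is simple to state; a minimal sketch, with illustrative names only:

```python
# Minimal sketch of the sanity check implied above: an event's offset
# must lie inside the current render buffer. Illustrative names only.

def valid_offset(offset, buffer_size):
    return 0 <= offset < buffer_size

# One monitor row: buffer size, event offset, then three MIDI data bytes.
row = (512, 176, 0x90, 60, 100)  # note-on, middle C, velocity 100
buffer_size, offset = row[0], row[1]
print(valid_offset(offset, buffer_size))  # True
print(valid_offset(600, 512))             # False: offset past the buffer end
```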

  • @Jorge I am not able to route MIDI with the new build. The TxRx plugin crashes when the Rx plugin sends MIDI in a basic AUM setup:

  • @MisplacedDevelopment said:
    @Jorge I am not able to route MIDI with the new build. The TxRx plugin crashes when the Rx plugin sends MIDI in a basic AUM setup:

    Thanks, that makes sense. I forgot to update the Rx instrument to send MIDI using the new system that includes transmitting the buffer offset. I'll fix it shortly after dinner.

  • edited January 2022

    Build 75:
    The VisualSwift-RX audio plugin is now able to transmit MIDI, including the buffer offset for sample precision. It was causing a crash before. The Tx-Rx MIDI processor plugin shows some debug information.

    I got the following working, with MIDI flowing from right to left and audio coming out of the rightmost channel.

  • Thanks Jorge. I can now hear live notes fine, but this test with Atom 2 driving Rx only sounds notes infrequently. Spacing the notes apart works OK. I have included the debug data in the picture (AUM @ 48K, 512 buffer).

  • edited January 2022

    @MisplacedDevelopment said:
    Thanks Jorge. I can now hear live notes fine but this test with Atom 2 driving Rx only sounds notes infrequently. Spacing the notes apart works OK. I have included the debug data in the picture (AUM @ 48K, 512 buffer)

    Thanks, I'll use your setup to troubleshoot.
    That debug info is very useful.

  • edited January 2022

    Build 76:
    Transmitting MIDI was skipping some events.

    @MisplacedDevelopment
    Thanks to your help I was able to replicate the issue and easily locate and fix the problem in the code.
    I'll get there eventually.

  • edited January 2022

    The fix worked! I tried NS2 earlier and it was not working great, but on my second iPad it runs fine, so there may be a buffer size mismatch, which I'll take a look at tomorrow.

    MIDI clips now play correctly in NS2 using an Rx module connected to an instrument hosted by AUM. This means I was able to play an IAA instrument using MIDI in NS2 and not have to set up any MIDI routing in NS2 itself, all I added was the Rx plugin. I tried it with a second track + IAA and that worked fine also.

    I shut down NS2 and AUM and reloaded both. Hit play in NS2 and everything started playing without needing to set things up again, which is fantastic!

    Here is the AUM setup:


    Here is one of the Rx plugins in NS2 (the other routes to channel B ):

  • edited January 2022

    @MisplacedDevelopment
    Great! Thanks.
    You're using a buffer size of 256; for me that would cause crackling. Maybe you have a better iPad; mine is a 2018 12.9" iPad Pro.

    I'll need to tidy it all up; it's getting confusing with the Rx and Tx names etc. I'm thinking the names could be VS-RxA-TxM Instrument, VS-TxA Effect and VS-TxM-RxM MIDI Processor. Or drop the VS and expand the M, giving RxA-TxMIDI, TxA and TxMIDI-RxMIDI.

  • I’m using an Air 4 and a Mini 6, so they can do quite a bit before the crackles start. I did notice, though, that with the audio Tx module added, the displayed CPU went up to 50% on the Air 4.

    Is it possible to have slightly different icons for the different modules to differentiate them?

  • edited January 2022

    @MisplacedDevelopment said:
    I’m using Air 4 and Mini 6 so they can do quite a bit before the crackles start. I did notice though that with the Audio Tx module added that displayed CPU went up to 50% on the Air 4.

    Is it possible to have slightly different icons for the different modules to differentiate them?

    Yes, definitely; I thought the same. It's just a question of finding appropriate icons. Not sure if you know SF Symbols; there's a free Apple macOS app to browse them, and they're very easy to use in code. I tried to find an appropriate SF Symbol for Rx and Tx but didn't come to a conclusion; there are antennas and waves etc. Actually, I just remembered that SF Symbols can't be used for app icons, so probably the same applies to extensions.

    If you increase the buffer size to 1024 the CPU percentage goes down a lot. I think there's still room for optimisations too.

    I'll need to clean up the GUIs too.

    Edit:
    The audio Tx is multiplying the input buffer by ThroughVolume and TxVolume. I think it could be a big performance improvement to just have switches to enable/disable through/Tx.
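The switch idea could look something like this sketch (hypothetical, not the plugin's code): gains of exactly 0.0 or 1.0 can skip the per-sample multiply entirely.

```python
# Hypothetical sketch: treating a gain as a switch lets the hot path
# skip per-sample multiplies when a route is fully off or fully on.

def apply_gain(buf, gain):
    if gain == 0.0:
        return [0.0] * len(buf)      # route disabled: no multiplies at all
    if gain == 1.0:
        return list(buf)             # route fully on: a plain copy
    return [s * gain for s in buf]   # only the general case pays per sample

print(apply_gain([0.5, -0.25], 0.0))  # [0.0, 0.0]
print(apply_gain([0.5, -0.25], 1.0))  # [0.5, -0.25]
print(apply_gain([0.5, -0.25], 0.5))  # [0.25, -0.125]
```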

  • @MisplacedDevelopment said:
    The fix worked! I tried NS2 earlier and it was not working great but on my second iPad it is running fine so there may be a buffer size mismatch which I’ll take a look at tomorrow.

    I followed up on this and the problem was indeed to do with mismatched buffer sizes. Setting NS2 to “medium” and AUM to 256 fixed the issues I was having. It is important to check the sample rate and buffer sizes of the sender and receiver if things are not behaving as they should.
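One way to picture why mismatched sizes break things: an offset computed against the sender's 512-frame buffer can point past the end of a 256-frame receiver buffer, so the receiver would have to re-map it. A hypothetical sketch, not the app's code:

```python
# Hypothetical sketch of the mismatch: an offset that is valid in the
# sender's 512-frame buffer must be re-mapped into one of the receiver's
# smaller 256-frame buffers, or it points past the buffer end.

SENDER_FRAMES, RECEIVER_FRAMES = 512, 256

def remap(sender_offset):
    """Return (receiver buffer index within the sender buffer, local offset)."""
    return sender_offset // RECEIVER_FRAMES, sender_offset % RECEIVER_FRAMES

print(remap(100))  # (0, 100): first receiver buffer, offset unchanged
print(remap(300))  # (1, 44): second receiver buffer, local offset 44
```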

  • @Jorge is there anything in particular we need to look out for with the latest beta update (Improved Audio and MIDI engines)?

  • edited February 2022

    @MisplacedDevelopment said:
    @Jorge is there anything in particular we need to look out for with the latest beta update (Improved Audio and MIDI engines)?

    Thanks for asking.

    The way the AudioUnits behave when you connect and disconnect components has changed a lot.
    I think it's a much better system but because it's very different there is potential for new bugs.
    Here's what I expect to be improved:

    1) When you disconnect a component, the audio is not temporarily stopped and should remain playing.
    2) MIDI events should now render more reliably with sample precision.

    I've collected a few plugins that I had trouble making work together in the past (fixing one would break others):
    1) Elsa: it used to crash if a MIDI event fell outside the current buffer.
    2) DubFilter: before, it wouldn't load properly.
    3) AudioStation: before, I could change the code to make AudioStation or Elsa work, but not both at the same time.

    They have all been working fine for me with the latest TestFlight VisualSwift v1.2.7 build 78, which, if it works well, will become the App Store VisualSwift v1.2.21.

    Effects now only load if they have some kind of upstream generator like an instrument.

    The best test you can make is to use the app as you would normally to make music and report any issues you might have.

    If it works well I'll start exposing some components related to making synthesizers and visualisations. I'll also go back to the TX and RX components and make them more user friendly between apps with different buffer sizes.

    Things that were challenging to implement and are worth testing:
    1) Multiple Audio/MIDI outputs connected to the same input
    2) One Audio/MIDI output connected to multiple inputs

    Please let me know your priorities too as I want to make the most of your help with testing.
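For the fan-in and fan-out cases listed above, the underlying operations are summing audio buffers and delivering one buffer to several inputs; a rough sketch under illustrative assumptions:

```python
# Rough sketch (illustrative only) of the two routing cases: fan-in
# sums audio sample by sample; fan-out delivers the same buffer to
# every connected input.

def mix(buffers):
    """Sum several equal-length audio buffers into one (fan-in)."""
    return [sum(samples) for samples in zip(*buffers)]

def fan_out(buf, n_inputs):
    """Deliver one output buffer to several inputs (fan-out)."""
    return [list(buf) for _ in range(n_inputs)]

print(mix([[0.5, 0.25], [0.25, -0.25]]))  # [0.75, 0.0]
print(fan_out([0.5, 0.25], 2))            # [[0.5, 0.25], [0.5, 0.25]]
```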

  • @Jorge I have a bit of time, so I have started looking at playing with VisualSwift again, and I cannot find the Tx/Rx module since updating. I thought you might have redesigned the way the modules work, but I can't find one that will appear as a MIDI source in AUM. This applies to the latest beta and the latest App Store version.

  • edited February 2022

    @MisplacedDevelopment said:
    @Jorge I have a bit of time so I have started looking at playing with VisualSwift again and cannot find the Tx/Rx module since updating. I thought you might have redesigned the way the modules work but can’t find one that will appear as a MIDI source in AUM. This applies to beta latest and the latest AppStore version.

    I'm trying to figure out why. In the code it all looks fine; the app includes three embedded Audio Units: the VisualSwift-RX instrument, the VisualSwift-TX effect and the VisualSwift-RX effect. I wonder if killing the app and opening it again would solve the issue.

    On the double-tap menu there should also be a TX component.

    EDIT: OK, I guess that answers it. I forgot to include the MIDI-TX-RX MIDI processor in the TestFlight version; I'll fix it and upload.

  • edited February 2022

    @MisplacedDevelopment said:
    @Jorge I have a bit of time so I have started looking at playing with VisualSwift again and cannot find the Tx/Rx module since updating. I thought you might have redesigned the way the modules work but can’t find one that will appear as a MIDI source in AUM. This applies to beta latest and the latest AppStore version.

    Thanks, this is now fixed in the latest TestFlight v1.2.7 build 79: https://testflight.apple.com/join/9KePAn4p

  • @Jorge said:

    @MisplacedDevelopment said:
    @Jorge I have a bit of time so I have started looking at playing with VisualSwift again and cannot find the Tx/Rx module since updating. I thought you might have redesigned the way the modules work but can’t find one that will appear as a MIDI source in AUM. This applies to beta latest and the latest AppStore version.

    Thanks, this is now fixed in the latest TestFlight v1.2.7 build 79: https://testflight.apple.com/join/9KePAn4p

    Cheers Jorge, it is back now.

  • @Jorge - sorry if I've missed something, but I've read all through the threads here and the manual and can't see whether there's a way of selecting a normal (not a VisualSwift Rx) MIDI receive channel for a plugin instrument. My specific use case is to run one instance of a MIDI-generating plugin which has selectable MIDI outputs for individual sequences (like Riffer or Fugue Machine) into a number of plugin instruments which are listening on separate channels and so respond to the different sequences. Is this currently possible, or is it something you can add?

  • @enkaytee said:
    @Jorge - sorry if I've missed something, but I've read all through the threads here and the manual and can't see whether there's a way of selecting a normal (not a VisualSwift Rx) MIDI receive channel for a plugin instrument. My specific use case is to run one instance of a MIDI-generating plugin which has selectable MIDI outputs for individual sequences (like Riffer or Fugue Machine) into a number of plugin instruments which are listening on separate channels and so respond to the different sequences. Is this currently possible, or is it something you can add?

    Thanks, I see how the VisualSwift channels can get easily confused with MIDI channels.

    I hope I understood what you're trying to do. Basically you require TX and RX components that work with MIDI channels.

    The TX component would have one MIDI Input and one MIDI Output. All the MIDI events that arrive at the input would be sent to one of 16 selected MIDI channels.

    The RX component would also have one MIDI Input and one MIDI Output. All the MIDI events arriving with a specified MIDI channel would be sent to the output.

    Actually maybe even better would be to allow you to select multiple channels.

    If I understood it right these components are very easy to implement, I'll try to create them quickly while your interest is fresh.
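The two components described above map neatly onto the MIDI status byte, whose low nibble carries the channel; a hypothetical sketch (the names and event layout are illustrative, not shipped code):

```python
# Hypothetical sketch of the proposed components. A channel-voice MIDI
# status byte keeps its message type in the high nibble and the channel
# (0-15 on the wire, shown as 1-16 to users) in the low nibble.

def tx_set_channel(event, channel):
    """TX: re-stamp an event onto the selected channel (1-16)."""
    status = (event[0] & 0xF0) | (channel - 1)
    return (status,) + event[1:]

def rx_filter(events, channels):
    """RX: pass only events whose channel is in the selected set."""
    return [e for e in events if (e[0] & 0x0F) + 1 in channels]

events = [(0x90, 60, 100),   # note-on on channel 1
          (0x91, 64, 100)]   # note-on on channel 2
print(tx_set_channel(events[0], 3))  # (146, 60, 100): 0x92, note-on on channel 3
print(rx_filter(events, {2}))        # [(145, 64, 100)]: only the channel-2 event
```

Selecting multiple channels, as suggested, falls out naturally since the RX side takes a set of channels rather than a single value.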

  • edited February 2022

    @Jorge said:

    @enkaytee said:
    @Jorge - sorry if I've missed something, but I've read all through the threads here and the manual and can't see whether there's a way of selecting a normal (not a VisualSwift Rx) MIDI receive channel for a plugin instrument. My specific use case is to run one instance of a MIDI-generating plugin which has selectable MIDI outputs for individual sequences (like Riffer or Fugue Machine) into a number of plugin instruments which are listening on separate channels and so respond to the different sequences. Is this currently possible, or is it something you can add?

    Thanks, I see how the VisualSwift channels can get easily confused with MIDI channels.

    I hope I understood what you're trying to do. Basically you require TX and RX components that work with MIDI channels.

    The TX component would have one MIDI Input and one MIDI Output. All the MIDI events that arrive at the input would be sent to one of 16 selected MIDI channels.

    The RX component would also have one MIDI Input and one MIDI Output. All the MIDI events arriving with a specified MIDI channel would be sent to the output.

    Actually maybe even better would be to allow you to select multiple channels.

    If I understood it right these components are very easy to implement, I'll try to create them quickly while your interest is fresh.

    I think that sounds right... forgive me, I'm a bit lost in the Tx/Rx components!

    I'll give an example in AUM. Riffer is a MIDI generator with 4 independent sequencers which by default transmit on MIDI channels 1-4. I load one instance in an AUM MIDI channel, then load up to 4 different instruments in audio channels. In AUM I can then assign a different MIDI receive channel to each instrument so they respond accordingly. In the screenshot, Pure Piano is receiving MIDI from Riffer only on channel 1. Is this what you're intending with the RX component you mention? Will it have a selectable 'MIDI receive' channel? Thanks!

    Riffer MIDI channel TX assignment:

    Pure Piano MIDI channel RX assignment:
