VisualSwift Public Beta

Comments

  • edited January 2022

    @MisplacedDevelopment said:
    Going from GB guitar to Loopy or AUM the sound is distorted.

    I was able to drive 3 instances of Obsidian from AUM by pointing 3 Atoms at separate tracks and using 3 Rx in AUM and 1 Tx on each NS2 track. I closed both AUM and NS2, then started NS2 (so its MIDI input port was visible), then the AUM session. Hit play and it all started playing.

    Thanks for testing it. I still don't know how well it works in terms of performance but there might be some room for optimisations.

    One thing I've noticed, with just 1 or 2 connections (I haven't tried more), is that the buffer size should be 1024; it also worked well for me with buffers of 512.

    I noticed, for example, that if I start the VisualSwift main app first, it sets the buffer size to 256; then when I start AUM it stays at 256, causing crackling. If I start AUM first and make sure the buffer size is 1024, then start VisualSwift, the buffer size stays at 1024 and you're able to change it inside AUM. Maybe I didn't explain that very well.

    So basically, when you get distorted sound, I wonder what the buffer size is. You can check in the AUM settings; I think all running apps share the same size, and when you change it in one, the others pick it up.
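
    In case it's useful context: on iOS the hardware buffer is negotiated through the shared audio session, which is why one app's setting can end up applying to all of them. A rough sketch of the usual request (not necessarily exactly what VisualSwift does):

    ```swift
    import AVFoundation

    // Ask iOS for a ~1024-frame IO buffer. The preference is expressed in
    // seconds, so it depends on the current sample rate. The OS may round
    // or ignore the request, and all running audio apps end up sharing
    // whatever the hardware is actually set to.
    func requestBufferSize(frames: Double) throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .default)
        try session.setPreferredIOBufferDuration(frames / session.sampleRate)
        try session.setActive(true)
        let actual = session.ioBufferDuration * session.sampleRate
        print("IO buffer is now \(actual) frames")
    }
    ```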

  • @Jorge said:
    TestFlight v1.2.7 build 53 is now available.

    The TX component can now transmit to one of 16 channels.
    The RX plugin can now receive from one of 16 channels.

    Here's an example of one RX plugin inside AUM:

    And here's an example of a schematic that sends one stream to AUM's channel 5 and another to an RX plugin hosted inside VisualSwift through channel 1.

    The RX plugin's channel index is a parameter which allows it to be controlled by MIDI and automated.
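
    Under the hood the channel index is a standard AU parameter, roughly like this (a simplified sketch; the identifier and address are illustrative):

    ```swift
    import AudioToolbox

    // Expose the RX channel index (1-16) as an automatable parameter.
    // Hosts can then record automation for it or map it to MIDI.
    let channelParam = AUParameterTree.createParameter(
        withIdentifier: "rxChannel",      // hypothetical identifier
        name: "RX Channel",
        address: 0,                       // hypothetical parameter address
        min: 1, max: 16,
        unit: .indexed,
        unitName: nil,
        flags: [.flag_IsReadable, .flag_IsWritable],
        valueStrings: nil,
        dependentParameters: nil)

    // The tree is what the AUAudioUnit publishes as its parameterTree.
    let parameterTree = AUParameterTree.createTree(withChildren: [channelParam])
    ```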

    Hi @Jorge, just to be clear: are the TX/RX AUs for sending MIDI only, not audio, between apps? Would love to see a video showing how this works; I've hit a block on how to set it up on a hazy Sunday morning!

  • @Jorge said:

    @MisplacedDevelopment said:
    Going from GB guitar to Loopy or AUM the sound is distorted.

    I was able to drive 3 instances of Obsidian from AUM by pointing 3 Atoms at separate tracks and using 3 Rx in AUM and 1 Tx on each NS2 track. I closed both AUM and NS2, then started NS2 (so its MIDI input port was visible), then the AUM session. Hit play and it all started playing.

    Thanks for testing it. I still don't know how well it works in terms of performance but there might be some room for optimisations.

    One thing I've noticed, with just 1 or 2 connections (I haven't tried more), is that the buffer size should be 1024; it also worked well for me with buffers of 512.

    I noticed, for example, that if I start the VisualSwift main app first, it sets the buffer size to 256; then when I start AUM it stays at 256, causing crackling. If I start AUM first and make sure the buffer size is 1024, then start VisualSwift, the buffer size stays at 1024 and you're able to change it inside AUM. Maybe I didn't explain that very well.

    So basically, when you get distorted sound, I wonder what the buffer size is. You can check in the AUM settings; I think all running apps share the same size, and when you change it in one, the others pick it up.

    I will have a play with the buffer sizes later to see if this helps.

    A mute in/out switch would be useful to have, to avoid the need to play with faders.

    I added a track of Cubasis 3 to the Obsidians to make 4 tracks. The thing to remember with Cubasis is that the track you want to listen to needs to be armed for record. I tried Auria, but things started crackling. Auria has other, unrelated problems with choosing the right sample rate, so it may take some fiddling; worth it, though, to be able to access its synths outside of the app!

  • edited January 2022

    @Jumpercollins these are for audio sending. As a simple test you can set up what is in effect a bus in AUM like this:

    The Tx plugin gets added in front of the audio source or instrument you want to send and the Rx plugin gets added as an instrument where you want to receive the audio. You then just need to make sure the send and receive channels are the same between the plugins.

    Now you can move these plugins into different apps to access sounds in places you wouldn’t normally be able to!
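
    Conceptually (just a toy model; I don't know how VisualSwift actually implements it) the pair behaves like a numbered shared bus: the Tx writes into a channel and any Rx set to the same channel reads from it:

    ```swift
    import Foundation

    // Toy model of the Tx/Rx bus idea. A real implementation would need a
    // lock-free FIFO that works across processes; a lock is used here only
    // to keep the sketch short.
    final class ToyAudioBus {
        static let shared = ToyAudioBus()
        private var channels = [[Float]](repeating: [], count: 16)
        private let lock = NSLock()

        // Called by a Tx plugin: push rendered frames onto its channel.
        func send(_ frames: [Float], channel: Int) {
            lock.lock(); defer { lock.unlock() }
            channels[channel - 1].append(contentsOf: frames)
        }

        // Called by an Rx plugin: pull up to `count` frames off the same channel.
        func receive(channel: Int, count: Int) -> [Float] {
            lock.lock(); defer { lock.unlock() }
            let frames = Array(channels[channel - 1].prefix(count))
            channels[channel - 1].removeFirst(frames.count)
            return frames
        }
    }
    ```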

  • edited January 2022

    @Jumpercollins said:

    Hi @Jorge, just to be clear: are the TX/RX AUs for sending MIDI only, not audio, between apps? Would love to see a video showing how this works; I've hit a block on how to set it up on a hazy Sunday morning!

    This is all very fresh, just something I woke up today wanting to try as a result of a discussion earlier in this thread.
    I think the idea of MIDI TX and MIDI RX plugins is very good (it probably exists already); they could work in a similar way to the Audio TX and Audio RX plugins. Maybe they would both be MIDI processors and would route between selected input and output channels. You could also have some kind of UI with filtering options and other features. Actually, it would probably be just one MIDI RX-TX plugin. Just thinking out loud.

  • If you could do this for MIDI as well then that would be novel, I think. We currently have MIDI Bus and MIDI Route which are part of https://apps.apple.com/gb/app/midi-tools/id1446209019 The MIDI Route app lets you send MIDI out but only to public MIDI destinations. MIDI Bus lets you route to other MIDI Bus instances, but with the limitation of staying within the same host.

  • @MisplacedDevelopment said:
    If you could do this for MIDI as well then that would be novel, I think. We currently have MIDI Bus and MIDI Route which are part of https://apps.apple.com/gb/app/midi-tools/id1446209019 The MIDI Route app lets you send MIDI out but only to public MIDI destinations. MIDI Bus lets you route to other MIDI Bus instances, but with the limitation of staying within the same host.

    That's really useful, thanks.

    I had another idea, maybe a TransportRX component inside VisualSwift that would allow you to pick one of the 16 channels and receive beat position and moving state from whichever host is transmitting through that channel. VisualSwift would then play in sync with it.
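
    Reading that information from a host looks feasible with the standard AUv3 host blocks; roughly like this inside the plugin (a sketch; relaying it across apps is the part the bus would handle):

    ```swift
    import AudioToolbox

    // A plugin can ask its host for beat position and transport state via
    // blocks the host supplies when render resources are allocated.
    final class TransportReader {
        // In a real AUAudioUnit subclass these come from
        // self.musicalContextBlock / self.transportStateBlock.
        var musicalContextBlock: AUHostMusicalContextBlock?
        var transportStateBlock: AUHostTransportStateBlock?

        func poll() -> (beatPosition: Double, isMoving: Bool)? {
            var beat = 0.0
            var flags = AUHostTransportStateFlags()
            guard musicalContextBlock?(nil, nil, nil, &beat, nil, nil) == true,
                  transportStateBlock?(&flags, nil, nil, nil) == true
            else { return nil }
            return (beat, flags.contains(.moving))
        }
    }
    ```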

  • Really glad I followed this thread. My first impression was that it didn’t do anything I couldn’t do in AUM, but with a less convenient (for me) interface. Having seen your plans for constructing instruments from low level components and similar plans for visualisers it’s become a lot more interesting! Plus the TX/RX components sound like they fill a gap in iOS audio plumbing.

    Thanks for your work on all this, I look forward to trying some of the new stuff when it hits a release version (I’m not on the beta, and TBH at the moment I’d struggle to do enough testing to provide useful feedback).

  • edited January 2022

    @MisplacedDevelopment said:
    Going from GB guitar to Loopy or AUM the sound is distorted.

    I was not able to get GB or Auria to play nicely, no matter what buffer settings I used or which order I started things in. Edit: Auria is now working; I had it set to 44.1k to work around a different problem, and creating a new 48k project fixed things. FF Twin 2 in AUM :smile:

    I did, however, manage to route audio out of Zenbeats, and so was able to drive the ZC1 synth from AUM. Unfortunately I could not get audio directly into a Zenbeats audio track, even though it could be routed in OK to an instrument using the Rx plugin. I do not believe it is possible to route and record an instrument to an audio track in Zenbeats. I was able to route to the audio track using the ApeSoft audio bus, but you can't record that part of the signal.

  • @MisplacedDevelopment said:
    @Jumpercollins these are for audio sending. As a simple test you can set up what is in effect a bus in AUM like this:

    The Tx plugin gets added in front of the audio source or instrument you want to send and the Rx plugin gets added as an instrument where you want to receive the audio. You then just need to make sure the send and receive channels are the same between the plugins.

    Now you can move these plugins into different apps to access sounds in places you wouldn’t normally be able to!

    Cheers @Jorge @MisplacedDevelopment. Just tried AUM into Loopy Pro using Nave, which is IAA (so normally not possible to record in Loopy Pro, as it's not an AU), and bingo, it works.

  • edited January 2022

    @Jorge the mixer on the Tx will be useful, a good addition. In the latest beta version I am seeing that the Rx plugin is missing the channel selectors:

    Edit: A reinstall fixed this!

  • I thought the volume slider was not working correctly, as I was expecting it to affect the bussed signal, but it actually controls the through volume.

  • edited January 2022

    @MisplacedDevelopment said:
    @Jorge the mixer on the Tx will be useful, a good addition. In the latest beta version I am seeing that the Rx plugin is missing the channel selectors:

    Edit: A reinstall fixed this!

    I was just about to suggest a reinstall when I saw your edit, as the RX looked like the old version.

    It's really easy for me to break things with new versions, so I really appreciate being notified quickly of possible issues, even if they end up not being real ones.

    I hope the slider in the TX component addresses your suggestion of having a mute button.
    There is also a new Audio Input component in the main app.

  • edited January 2022

    @MisplacedDevelopment said:
    I thought the volume slider was not working correctly, as I was expecting it to affect the bussed signal, but it actually controls the through volume.

    Yes, I can easily change it to have two sliders, TX Volume and Through Volume, if that's useful. I thought maybe the Through volume was the more important one. They definitely need labels.

    I've optimised some of the code with vector instructions, so I think it might work a bit faster now; there is still one place using a loop instead of vector instructions, so it might improve further in the future. Not sure if the optimisations lead to new crackling; hopefully not.
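
    For the curious, the kind of change I mean is replacing a per-sample loop with a call into Apple's Accelerate/vDSP, e.g. for a simple gain stage (illustrative):

    ```swift
    import Accelerate

    // Scalar version: one multiply per loop iteration.
    func applyGainLoop(_ samples: [Float], gain: Float) -> [Float] {
        var out = samples
        for i in out.indices { out[i] *= gain }
        return out
    }

    // Vectorised version: vDSP processes many samples per instruction.
    func applyGainVectorised(_ samples: [Float], gain: Float) -> [Float] {
        var g = gain
        var out = [Float](repeating: 0, count: samples.count)
        vDSP_vsmul(samples, 1, &g, &out, 1, vDSP_Length(samples.count))
        return out
    }
    ```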

  • @Jorge said:

    @MisplacedDevelopment said:
    I thought the volume slider was not working correctly, as I was expecting it to affect the bussed signal, but it actually controls the through volume.

    Yes, I can easily change it to have two sliders, TX Volume and Through Volume, if that's useful. I thought maybe the Through volume was the more important one.

    I've optimised some of the code with vector instructions, so I think it might work a bit faster now; there is still one place using a loop instead of vector instructions, so it might improve further in the future. Not sure if the optimisations lead to new crackling; hopefully not.

    Thanks for adding this. Yes, the through volume was the more important one, so that it was possible to remove the original signal from the chain, but a second slider to change the bussed volume would also be handy. The new volume value's state also saves nicely.

  • The user and all related content has been deleted.
  • @Jorge Has the RX AU component gone missing again with the latest build? I was trying Loopy Pro with Loopy Builder, which has an AU effects slot, in split-screen mode; it should be an awesome combo to send audio between, but I couldn't find the RX module in Loopy Pro.

  • edited January 2022

    @Jumpercollins said:
    @Jorge Has the RX AU component gone missing again with the latest build? I was trying Loopy Pro with Loopy Builder, which has an AU effects slot, in split-screen mode; it should be an awesome combo to send audio between, but I couldn't find the RX module in Loopy Pro.

    Interesting find; I didn't think of that. The VisualSwift-RX plugin is an instrument, so it doesn't appear in the list of effects.
    I think I'll create a VisualSwift-RX Effect plugin that will mix the input with the received bus; I guess it would have two volume sliders, for the input and received streams.

    Or maybe there would be only one VisualSwift-RX-TX (VisualSwift-Router?) Effect plugin that could perform both tasks; maybe a switch would determine whether it's receiving or transmitting. I might let the idea settle a bit first. It could even do both receiving and transmitting at the same time, and would be flexible enough to receive from one channel and transmit to another.
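
    For context, the instrument/effect split comes from the component type a plugin registers; hosts filter their lists by it. Roughly like this (the four-character codes below are made up for illustration):

    ```swift
    import AudioToolbox

    // Pack a four-character code like "aumu" into an OSType.
    func fourCC(_ s: String) -> OSType {
        s.utf8.reduce(0) { ($0 << 8) + OSType($1) }
    }

    // An instrument ('aumu'): appears in hosts' instrument lists only.
    let rxInstrument = AudioComponentDescription(
        componentType: kAudioUnitType_MusicDevice,
        componentSubType: fourCC("vsrx"),        // hypothetical code
        componentManufacturer: fourCC("VSft"),   // hypothetical code
        componentFlags: 0, componentFlagsMask: 0)

    // An effect ('aufx'): appears in effect slots, like Loopy Pro's.
    let rxEffect = AudioComponentDescription(
        componentType: kAudioUnitType_Effect,
        componentSubType: fourCC("vsfx"),        // hypothetical code
        componentManufacturer: fourCC("VSft"),
        componentFlags: 0, componentFlagsMask: 0)
    ```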

  • @Jumpercollins, in the latest v1.2.7 build 60 there is a new VisualSwift-RX Effect plugin that hopefully allows what you were trying to achieve. Here's an example screenshot (notice that both TX and RX are Effects):

  • The latest update is ace, having the two faders on both plugins to control the signal levels works well. The audio effect version of the Rx plugin is nice to have in the toolbox, though I don’t believe there are any traditional iOS DAWs which let you directly print audio coming through an effect plugin without doing a mixdown.

    GarageBand is the only host that I am finding problems with. The signal going out of GB sounds like it did when I had Auria set to the wrong sample rate. I wonder if GB somehow sets its own internal sample rate, ignoring whatever else is running, since it does not normally interact with other apps; but then I guess the plugin would be told about this?

  • edited January 2022

    @MisplacedDevelopment said:
    The latest update is ace, having the two faders on both plugins to control the signal levels works well. The audio effect version of the Rx plugin is nice to have in the toolbox, though I don’t believe there are any traditional iOS DAWs which let you directly print audio coming through an effect plugin without doing a mixdown.

    GarageBand is the only host that I am finding problems with. The signal going out of GB sounds like it did when I had Auria set to the wrong sample rate. I wonder if GB somehow sets its own internal sample rate, ignoring whatever else is running, since it does not normally interact with other apps; but then I guess the plugin would be told about this?

    The main VisualSwift app needs to be aware of the sampling rate, but the plugins don't; the TX plugin just passes the arriving frames to the RX at whichever size iOS decides. I think the reason it all works is that behind the scenes iOS is coordinating all the apps. I've noticed that when you change the frame size inside AUM, the VisualSwift internal audio engine automatically picks up the change, even though I didn't write any code to deal with it. With the plugins hosted in other apps there is no VisualSwift main app involved; it is out of the equation, so I think it's a question of how those other hosts work (like you suggest).

    I don't know why, but I think the 44.1k sampling rate used to be the standard, and lately it has changed to 48k. There's probably something I could do; I've seen that Apple's AudioUnit framework helps make it easy to convert sample rates when required. So far I've been trying to keep it as simple as possible, but it's something to investigate.
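
    The sort of conversion I mean is fairly straightforward with AVAudioConverter; a sketch of the general approach (not something the plugins do today):

    ```swift
    import AVFoundation

    // Resample one PCM buffer, e.g. from a 44.1kHz host into a 48kHz graph.
    func resample(_ input: AVAudioPCMBuffer, to sampleRate: Double) -> AVAudioPCMBuffer? {
        guard let outFormat = AVAudioFormat(standardFormatWithSampleRate: sampleRate,
                                            channels: input.format.channelCount),
              let converter = AVAudioConverter(from: input.format, to: outFormat),
              let output = AVAudioPCMBuffer(
                  pcmFormat: outFormat,
                  frameCapacity: AVAudioFrameCount(
                      Double(input.frameLength) * sampleRate / input.format.sampleRate) + 1)
        else { return nil }

        var consumed = false
        var error: NSError?
        let status = converter.convert(to: output, error: &error) { _, outStatus in
            // Hand the converter our single input buffer, then signal end of stream.
            if consumed { outStatus.pointee = .endOfStream; return nil }
            consumed = true
            outStatus.pointee = .haveData
            return input
        }
        return (status == .error || error != nil) ? nil : output
    }
    ```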

    Your testing is really useful.

  • Another big feature addition: side-by-side support and the ability to run multiple copies of the host app! I can't think of any existing audio+MIDI hosts that let you run multiple instances of themselves.

    I wonder what sort of use-cases this opens up? You could logically split parts of your workflow and connect the modules (instances of the app) using the Tx/Rx modules. If you play live then maybe such a setup can be used to add some resilience, e.g. a synth crashes in one instance of the app so you just switch over to an identical copy running in another window.

  • Thinking about it some more: does the ability to run multiple hosts also allow people with shiny M1 iPads to make more use of their hardware? I wonder if the OS gives each copy of the host a separate RAM and CPU quota, which might let people who struggle with a single instance of, say, AUM have more plugins running.

  • edited January 2022

    @MisplacedDevelopment said:
    Another big feature addition: side-by-side support and the ability to run multiple copies of the host app! I can't think of any existing audio+MIDI hosts that let you run multiple instances of themselves.

    I wonder what sort of use-cases this opens up? You could logically split parts of your workflow and connect the modules (instances of the app) using the Tx/Rx modules. If you play live then maybe such a setup can be used to add some resilience, e.g. a synth crashes in one instance of the app so you just switch over to an identical copy running in another window.

    The main use-case would be drag-copying a custom module from one schematic to the other, or a plugin with its full state, or a selection of components.

    I have many hundreds of components hidden, as I realise they would just scare users and need to be released in a very controlled manner, with documentation. You can find out about a lot of them on the VisualSwift YouTube channel and the VisualSwift forum. There is a Module component that, in combination with the Input and Output components, allows you to create hierarchical schematics; it then becomes very useful to be able to drag one of those custom modules from one schematic to another.

    Another similar use case that I have hidden is being able to drag images and audio files from the Files app to a schematic. Audio files would open in an audio player and images in an Image UI component that you can use in a custom front panel. The whole system for creating UIs is implemented too, and hidden in the latest versions; it follows the way SwiftUI works, as shown in some of my videos. My struggle is how to release and document it all. If you have suggestions, they're very welcome. I'm currently thinking of keeping the app very simple until you unlock the SynthMaker / Visualiser / MIDI features, so that only advanced users need to see them.

  • I’ve been buried with work and stuff since asking about the multi-channel TX/RX and am finally now catching up on where you’ve taken that concept. I’m really excited to try it all out and thank you for building something really special.

  • edited January 2022

    Build 62 introduces drag-copy as in the following screenshot:

    You might, for example, collect into one schematic a few sequencers with patterns that you want to reuse in the future.

    To drag-copy, do a long press on the top-right corner of the component, followed by a drag.

    Build 63 introduces MIDI-Out and allows selection of Sources and Destinations.

    Bending links wasn't working when zoomed in or out; this is now fixed. Note that bending links requires a trackpad.

    Here's an example showing the new features of the latest build:

  • @mulletsaison said:
    I’ve been buried with work and stuff since asking about the multi-channel TX/RX and am finally now catching up on where you’ve taken that concept. I’m really excited to try it all out and thank you for building something really special.

    Thanks, I'm looking forward to hearing whether it works for you, and to any suggestions you might have.

  • I didn't know there was a forum:

    https://www.visualswift.com/forum/

    Not much there yet... get in on the ground floor and centralize the discussion of features and use cases.

    There is also a large collection of YouTube videos:

    https://www.youtube.com/channel/UC9iP2bfGRdFPfYy9ALEvEqA/videos

  • edited January 2022

    EDIT: Oops PM text copied into this thread.

  • @Jorge said:

    @MisplacedDevelopment said:
    Another big feature addition: side-by-side support and the ability to run multiple copies of the host app! I can't think of any existing audio+MIDI hosts that let you run multiple instances of themselves.

    I wonder what sort of use-cases this opens up? You could logically split parts of your workflow and connect the modules (instances of the app) using the Tx/Rx modules. If you play live then maybe such a setup can be used to add some resilience, e.g. a synth crashes in one instance of the app so you just switch over to an identical copy running in another window.

    The main use-case would be drag-copying a custom module from one schematic to the other, or a plugin with its full state, or a selection of components.

    I have many hundreds of components hidden, as I realise they would just scare users and need to be released in a very controlled manner, with documentation. You can find out about a lot of them on the VisualSwift YouTube channel and the VisualSwift forum. There is a Module component that, in combination with the Input and Output components, allows you to create hierarchical schematics; it then becomes very useful to be able to drag one of those custom modules from one schematic to another.

    Another similar use case that I have hidden is being able to drag images and audio files from the Files app to a schematic. Audio files would open in an audio player and images in an Image UI component that you can use in a custom front panel. The whole system for creating UIs is implemented too, and hidden in the latest versions; it follows the way SwiftUI works, as shown in some of my videos. My struggle is how to release and document it all. If you have suggestions, they're very welcome. I'm currently thinking of keeping the app very simple until you unlock the SynthMaker / Visualiser / MIDI features, so that only advanced users need to see them.

    Hundreds of hidden components?

    I've been following this thread some and am wondering if the end objective for VisualSwift is something like a schematic-based modular design system? Sort of like a much more configurable version of Drambo?

    One of my unrealized dream apps for iPad would be a polyphonic-capable modular system with a Drambo-like plethora of modules, but with an integrated UI designer and a good selection of meta-control elements for constructing custom front ends to control the parameters of instruments and other designs built with the modular system.

    Nurack touches on a UI designer for effects.

    From the little I'm seeing here of VisualSwift, it looks like it might be a good foundation for such a system.

    For example... Drambo introduced a design paradigm for constructing complex racks that lets the user select modules to be either shown or hidden, allowing a less complex, less cluttered view for actually using what the user constructs.

    Suppose the schematic-based modules of VisualSwift could also be assigned a "show or hide" type parameter. But unlike Drambo, which has only the two show/hide options, VisualSwift could have multiple levels/layers of show/hide options...

    VisualSwift might have "view-layers": 1, 2, 3, 4... View Layer 1 could contain all modules used for a given project, but then it might get more interesting...

    Imagine building a complex synth and utilizing components that "won't" have parameters requiring any meta-controllers... Everything can be "wired" together on View Layer 1, but all the components that "will" require meta-control can be set to display on both view-layers 1 and 2.

    Next the user switches the display to View Layer 2... Imagine half the Layer 1 modules disappearing and becoming hidden. On View Layer 2 the user can now drag the modules visible only on View Layer 2 into new positions, and then start adding meta-control modules on Layer 2 to control those parameters as desired. The meta-control modules can be set to display on view-layers 2 and 3.

    Switch to View Layer 3... Now the user sees only the meta controls: switches, knobs, sliders, buttons, rotary knobs, keyboards, etc.

    On Layer 3 the user can drag the meta controls around the schematic, putting things wherever they like. In the case of building a synth, this is where the user designs "the look" of the interface. Want to build a synth that looks like a vintage synth? Layer 3 is where to do the GUI design.

    The user could also add a Layer 4 and build variations with alternate GUI control surfaces for their synth.

    The key here is that the user can switch to any view layer, and all the modules assigned to display on a given layer will show up with the positions and wiring they were configured with on that layer.

    This would be like a three-dimensional design space, like that used in circuit boards that place components on both sides. But in this case there could be more than two sides to work with, allowing for connectivity between all modules on all layers.
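
    In data-model terms, the idea might boil down to something like this (a purely hypothetical sketch, not anything from VisualSwift):

    ```swift
    import CoreGraphics

    // Each module lists the view-layers it appears on and keeps an
    // independent position per layer, so Layer 1 can show the full wiring
    // while Layer 3 shows only the repositioned meta controls.
    struct Module {
        let name: String
        var visibleOnLayers: Set<Int>
        var positionPerLayer: [Int: CGPoint] = [:]
    }

    struct Schematic {
        var modules: [Module] = []

        // What the user sees after switching to a given view-layer.
        func visibleModules(onLayer layer: Int) -> [Module] {
            modules.filter { $0.visibleOnLayers.contains(layer) }
        }
    }
    ```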
