VisualSwift Public Beta

Comments

  • edited February 2022

    @enkaytee

    Here's what I've done so far just to check if it's what you mean:

    In the example above, Riffer is set up as normal with R1 on channel 1 and R2 on channel 2. The two new ChannelSource components let MIDI events through depending on their channel: the top one is set up to pass only events on channel 1, and the bottom one only events on channel 2. That means R1 is playing Pure Synth and R2 is playing Lagrange.
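
    For anyone curious how this kind of filtering works at the MIDI level, here's a minimal Swift sketch of the idea (hypothetical names, not VisualSwift's actual implementation): the channel is the low nibble of a message's status byte.

    struct ChannelFilter {
        var allowedChannels: Set<UInt8>          // 0-15, i.e. MIDI channels 1-16

        // Returns the message if its channel is allowed, nil otherwise.
        func filter(_ message: [UInt8]) -> [UInt8]? {
            guard let status = message.first, status < 0xF0 else {
                return message                   // system messages have no channel
            }
            return allowedChannels.contains(status & 0x0F) ? message : nil
        }
    }

    // The top component would behave like ChannelFilter(allowedChannels: [0]),
    // so [0x90, 60, 100] (note-on, channel 1) passes and [0x91, 60, 100] (channel 2) is dropped.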

  • @Jorge said:
    That means that R1 is playing Pure Synth and R2 is playing Lagrange.

    Perfect - thanks @Jorge ...👍

  • edited February 2022

    @enkaytee said:

    Perfect - thanks @Jorge ...👍

    Cool, I'm uploading this now; it will be TestFlight v1.2.7 build 81 and should be available in the next 15 minutes or so. If you find bugs, let me know. It can probably be improved, for example by sending all-notes-off when you deselect a channel (see the sketch below).

    At the moment it allows multiple channels to be selected. Might be more user friendly to just allow one or have a setting to choose.
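
    For reference, All Notes Off is just Control Change 123 (value 0) on the channel being deselected; a minimal Swift sketch with a hypothetical helper, not VisualSwift's actual API:

    // All Notes Off is Control Change 123 with value 0, sent on the channel in question.
    func allNotesOff(onChannel channel: UInt8) -> [UInt8] {
        precondition(channel < 16)
        return [0xB0 | channel, 123, 0]          // 0xB0 = Control Change status byte
    }

    // Deselecting channel 2 could then send allNotesOff(onChannel: 1), i.e. [0xB1, 123, 0].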

  • @Jorge said:

    At the moment it allows multiple channels to be selected. Might be more user friendly to just allow one or have a setting to choose.

    I think the multiple channel 'grid' works OK for me - it's sort of what I'm used to in AUM...thanks again for all your work, this is really starting to take shape...

  • @enkaytee said:

    I think the multiple channel 'grid' works OK for me - it's sort of what I'm used to in AUM...thanks again for all your work, this is really starting to take shape...

    OK, I'll leave it as multiple channel, build 81 is now available.

  • edited February 2022

    @Jorge - maybe I'm being stupid but I can't find the new 'channelsource' component in the pop up list or libraries...am I missing something? I've tried reinstalling...

  • @Jorge said:

    OK, I'll leave it as multiple channel, build 81 is now available.

    You wouldn't want to limit it to one channel: MPE synths and controllers use up to 16 channels per synth, since they can assign different, dynamically changing channels to each voice.
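
    To make that concrete, here's a small illustrative Swift sketch of why a single-channel filter breaks MPE: each held note arrives on its own member channel.

    // Even a simple MPE chord spans several channels, one per held note. Illustrative only.
    let mpeChord: [[UInt8]] = [
        [0x91, 60, 100],    // note-on, channel 2
        [0x92, 64, 100],    // note-on, channel 3
        [0x93, 67, 100]     // note-on, channel 4
    ]
    let allowed: Set<UInt8> = [1]                                  // pass channel 2 only
    let kept = mpeChord.filter { allowed.contains($0[0] & 0x0F) }
    // kept.count == 1: two of the three notes would be silently dropped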

  • @enkaytee said:
    @Jorge - maybe I'm being stupid but I can't find the new 'channelsource' component in the pop up list or libraries...am I missing something? I've tried reinstalling...

    Here's where you should find it:

    On the Libraries sidebar to the right, inside the MIDI section, the first component is ChannelSource; you can drag it onto the schematic. Make sure you see Version:1.2.7 build 81 at the top right.

    If you still don't see it let me know.

    I'm currently trying to avoid having too many components on the Double-Tap menu to keep it simple.

  • edited February 2022

    @espiegel123 said:

    You wouldn't want to limit it to one channel: MPE synths and controllers use up to 16 channels per synth, since they can assign different, dynamically changing channels to each voice.

    Yeah, right, but in that case you would use all the channels at the same time and probably wouldn't want to do any channel filtering and so wouldn't use the ChannelSource component at all. I could be wrong as I have very little experience with this.

    EDIT: Maybe "Select All" and "Clear All" buttons would be useful. Also I'm thinking of implementing some kind of visual feedback in each grid cell to indicate that an event arrived on that channel.

  • @Jorge said:

    If you still don't see it let me know.

    This is what I see @Jorge:

  • @Jorge said:

    Yeah, right, but in that case you would use all the channels at the same time and probably wouldn't want to do any channel filtering and so wouldn't use the ChannelSource component at all. I could be wrong as I have very little experience with this.

    Yes, I just wanted to make sure that you knew about this as I saw a comment up thread that made it sound like you might be considering a single channel per cable.

  • @enkaytee said:

    This is what I see @Jorge:

    Thanks, clearly something is wrong. I'm on my way home at the moment; I'll have a look in about two hours and try to fix it.

  • edited February 2022

    @enkaytee said:

    This is what I see @Jorge:

    I had uploaded the wrong build, should be fine now with build 82. I was able to upload it on the move.

  • @Jorge said:

    I had uploaded the wrong build, should be fine now with build 82. I was able to upload it on the move.

    Thanks - all good now...👍

  • edited February 2022

    Now that I've spent more time with this, I'm beginning to appreciate the concept a lot more. One of my most-used desktop music tools is Element, which is basically a plugin host with a similar approach to linking components, so I guess I'm used to this way of working. To those who aren't sure about VisualSwift, I'd say give it a chance; it may just grow on you. Thanks @jorge...

  • Version 1.2.22 is now available in the App Store. It fixes a bug where live MIDI events weren't being recognised inside MIDI Processors. You should now be able to record live playing, for example with a MIDI-In component connected to an Atom Piano Roll MIDI Processor, as in the following screenshot:

  • Version 1.2.23 is now available. Here's a short video showing an example made by recording an external MIDI keyboard into 3 instances of Piano Roll:

    Two of the MIDI streams are merged and go to the same instrument.

  • thanks for the update @Jorge!
    any news on MSL implementation?

  • edited June 2022

    @cazel said:
    thanks for the update @Jorge!
    any news on MSL implementation?

    Thanks for asking, here it is in action:

    Here's a video rendered with it:

    You still see many MetalSnippet components with actual MSL (Metal Shading Language) code, but the idea is to replace them so you don't have to write any code. It's good to be able to drop down to code for flexibility, though.

    The engine runs both a fragment shader for visual effects and a compute shader for generating audio, so you can create your own instruments and effects running on the GPU. On the audio side I've created oscillators, filters and envelopes, so you can already build simple synthesisers. A variable is passed to the shader with the index of the note being played and also the voice index within the note. As GPUs are very good at running many instances at the same time, you could easily have 8 or more voices per note played, and you can use the voice index to create detuning etc. (see the sketch at the end of this post).

    The example above runs a RayMarching component, but you don't have to; you can create more generic types of shaders. The challenge, as you probably know, is that writing shader code makes the brain hurt, so it's all about splitting it into more user-friendly, easy-to-understand visual components and functions.

    The new VisualSwift MetalInstrument component instantiates a few buffers internally that are shared by both the fragment shader and the compute shader. That way you have access to the audio buffer, the FFT version of it and MIDI streams from the fragment shader. I've created an example with Ray-Marching where each note you play on the keyboard causes a 3D object to appear.

    At some point I tried creating a piano keyboard that would respond to keys being played:

    Would be great to create a synthesiser's front panel using Ray-Marching.
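
    To illustrate the per-voice detune idea mentioned above, here's a minimal sketch in Swift (rather than MSL), with made-up names rather than VisualSwift's actual engine code:

    // Minimal sketch of per-voice detuning (Swift rather than MSL; illustrative only).
    import Foundation

    let sampleRate = 44_100.0
    let voicesPerNote = 8
    let detuneCents = 12.0                       // total spread of +/- 12 cents

    // One output sample for a note: sum one slightly detuned sine per voice.
    func sample(noteNumber: Int, frame: Int) -> Float {
        let baseHz = 440.0 * pow(2.0, (Double(noteNumber) - 69.0) / 12.0)
        var mix = 0.0
        for voice in 0..<voicesPerNote {
            // Map the voice index 0...7 onto a detune factor in -1...+1.
            let spread = Double(voice) / Double(voicesPerNote - 1) * 2.0 - 1.0
            let hz = baseHz * pow(2.0, spread * detuneCents / 1200.0)
            mix += sin(2.0 * .pi * hz * Double(frame) / sampleRate)
        }
        return Float(mix / Double(voicesPerNote))
    }

    _ = sample(noteNumber: 60, frame: 0)         // first sample of middle C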

  • 😱 This is going to be in VisualSwift? Is that not a big deal or what? I only just started looking into shaders since KodeLife went on sale a while ago and wanted to do some in the other VS, but this is amazing 🤤
    And I only just heard about VSwift recently, since it was mentioned as a workaround for Studiomux fx.

    Props, I’ll be picking this up soon.

  • McD
    edited July 2022

    What are you supposed to touch to get an AUv3 app to go to full screen in VisualSwift?
    UPDATE: I found it… the Maximize widget that looks like a rectangle at the upper right.
    I had been hitting that for several minutes to no effect on a “Mozaic” MIDI-Processor unit.

    UPDATE 2: It looks like I didn’t have all the required connections established; the app won’t go blue and open until they are.

    I went into VisualSwift looking for all the Apple AUs, and it has more than I can see in ApeMatrix. But it looks like they’re pretty basic, there to give devs something to study, I think.

  • @McD said:
    UPDATE 2: It looks like I didn’t have all the required connections established; the app won’t go blue and open until they are.

    I went into VisualSwift looking for all the Apple AUs, and it has more than I can see in ApeMatrix. But it looks like they’re pretty basic, there to give devs something to study, I think.

    I think it would be more user-friendly to have the plugins always loaded even if they are not part of the audio graph; that's how it used to work in the past, but I've decided to prioritise performance and have them unload when disconnected. Maybe the best option would be a setting that lets the user choose the behaviour (see the sketch below).

    Apple's AUs are very basic but also very easy to implement and good for testing the audio engine. As they are not very useful, I kept them out of the schematic's double-tap menu.
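
    For what it's worth, here's a rough Swift sketch of what such a setting could look like in an AVAudioEngine-based host; the names and behaviour are hypothetical, not VisualSwift's actual code:

    // Rough sketch of a "keep plugins loaded" setting in an AVAudioEngine-based host.
    import AVFoundation

    final class HostedPlugin {
        let engine: AVAudioEngine
        var keepLoadedWhenDisconnected = true            // the proposed user setting
        private var audioUnit: AVAudioUnit?

        init(engine: AVAudioEngine) { self.engine = engine }

        func load(_ description: AudioComponentDescription,
                  completion: @escaping (AVAudioUnit?) -> Void) {
            AVAudioUnit.instantiate(with: description, options: []) { unit, _ in
                if let unit = unit { self.engine.attach(unit) }
                self.audioUnit = unit
                completion(unit)
            }
        }

        func disconnect() {
            guard let unit = audioUnit else { return }
            engine.disconnectNodeOutput(unit)
            if !keepLoadedWhenDisconnected {
                engine.detach(unit)                      // frees resources now,
                audioUnit = nil                          // but reconnecting needs a reload
            }
        }
    }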

  • @Jorge I can't find VisualSwift in the Mac App Store 👀 -is it temporary?-

  • @cazel said:
    @Jorge I can't find VisualSwift in the Mac App Store 👀 -is it temporary?-

    Sorry, I've removed it because I'm trying to work on a small set of manageable features, and so far I haven't had time to check how well the iPad version of VisualSwift works on macOS. I've also been working on a version that is dedicated to macOS (rather than the current iPad version running on macOS), although it might take a while before it's ready.

    Because you mentioned it I've now made it available again.

  • @Jorge said:

    Because you mentioned it I've now made it available again.

    thanks a lot @Jorge 🙏🏿
    just fyi I've been using VisualSwift on M1+M2 in Live11.2b6-7-8 without any issues so far. also with iCloud project sync.

  • @cazel said:

    thanks a lot @Jorge 🙏🏿
    just fyi I've been using VisualSwift on M1+M2 in Live11.2b6-7-8 without any issues so far. also with iCloud project sync.

    It would be awesome if VisualSwift supported Ableton Link.

  • @auxmux 🙌🏿 yes we need Link! thanks for reminding (:
    I've some max|ofx|plugdata workarounds but Link would be a lifesaver
