
VisualSwift Public Beta

135 Comments

  • @McD said:
    I didn't know there was a forum:

    https://www.visualswift.com/forum/

    Not much there yet... get in on the ground floor and centralize the discussion of features and use cases.

    There is also a large collection of YouTube videos:

    https://www.youtube.com/channel/UC9iP2bfGRdFPfYy9ALEvEqA/videos

    Hi @McD, I tried to get a forum going some time ago, but it just didn't get active. I've been hesitant about linking to it here; at the same time, I don't want to annoy Audiobus forum members by bumping this thread to the top too often. I think I'll go with the flow and see what happens. I'll keep checking the VisualSwift forum in case someone posts there, and I'll reply if someone does. The VisualSwift forum might be the better place for the more advanced low-level components.

    The videos are from old versions of VisualSwift and show features and components that are currently hidden, but which I hope to release very soon.

  • edited January 2022

    @horsetrainer said:
    Hundreds of hidden components?

    I've been following this thread some and am wondering if the end objective for VisualSwift is something like a schematic-based modular design system? Sort of like a much more configurable version of Drambo?

    Yes, the idea is that you'd create your own modules from low-level components. Here's an example of how you would create an ADSR module:
    https://www.visualswift.com/forum/viewtopic.php?f=2&t=52

    Users would create and share modules; more realistically, a minority of advanced users would create modules and share them with the others.

    In VisualSwift, what will allow you to create custom UIs is a combination of hierarchical schematics and SwiftUI concepts. I can expand on this if you have questions.
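
    To give a rough idea of where this is heading: a module's UI would compose much like nested SwiftUI views. Here's a minimal sketch, with illustrative names rather than actual VisualSwift components:

        import SwiftUI

        // Illustrative only: a "module" UI is a view composed of smaller
        // views, mirroring how a schematic is composed of components.
        struct KnobView: View {
            let label: String
            @Binding var value: Double

            var body: some View {
                VStack {
                    Slider(value: $value, in: 0...1)
                    Text(label).font(.caption)
                }
            }
        }

        struct ADSRModuleView: View {
            @State private var attack = 0.1
            @State private var decay = 0.2
            @State private var sustain = 0.7
            @State private var release = 0.3

            var body: some View {
                HStack {
                    KnobView(label: "A", value: $attack)
                    KnobView(label: "D", value: $decay)
                    KnobView(label: "S", value: $sustain)
                    KnobView(label: "R", value: $release)
                }
            }
        }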

    Another feature I have working well, but which still needs documentation and the low-level components exposed before release, is the automatic generation of an Xcode project from a VisualSwift schematic. You would then be able to build the project and submit it to the App Store, as demonstrated in this post:
    https://www.visualswift.com/forum/viewtopic.php?f=2&t=48

  • @Jorge said:
    Hi @McD, I tried to get a forum going some time ago, but it just didn't get active.

    As your application gains capabilities, I think you'll get more attention there.

    The videos are from old versions of VisualSwift and show features and components that are currently hidden, but which I hope to release very soon.

    I can see that, but the right set of features will make it very useful.

    I just used the TX/RX capability on channel 1 to pipe the IAA "Funk Drummer" from AUM into VisualSwift. I couldn't get Ableton Link to work for Start/Stop, but AUM also has Start/Stop for Ableton Link, so maybe Funk Drummer connected to AUM and that blocked VisualSwift.

    I'll keep testing to see if I can make that work. I also sent a "Noise Melody" instance in VisualSwift into AUM successfully, using a TX/RX in that direction on channel 2.

    I think you're on a good path towards making interesting layouts. I'd love to see a series of UI tools to augment Mozaic, which has limited UI options and no capability to display text, since it doesn't support a character data type. I tested Mozaic inside VisualSwift and I like the way it displays there and can be cabled to one or more AU instruments. I might do some Mozaic script development inside your VisualSwift environment, since you handle Mozaic's full-screen setup better than other AUv3 hosts.

  • edited January 2022

    Build 65: The mixer now shows the icons of the plugins used in each channel; tapping an icon maximises the corresponding plugin.

    Here's the same mixer after maximising it:

  • edited January 2022

    @McD said:
    I just used the TX/RX capability on channel 1 to pipe the IAA "Funk Drummer" from AUM into VisualSwift. I couldn't get Ableton Link to work for Start/Stop, but AUM also has Start/Stop for Ableton Link, so maybe Funk Drummer connected to AUM and that blocked VisualSwift.

    I need to revisit the Ableton Link implementation; I've spent very little time with it, and it needs improving.

    @McD said:
    I think you're on a good path towards making interesting layouts. I'd love to see a series of UI tools to augment Mozaic, which has limited UI options and no capability to display text, since it doesn't support a character data type. I tested Mozaic inside VisualSwift and I like the way it displays there and can be cabled to one or more AU instruments. I might do some Mozaic script development inside your VisualSwift environment, since you handle Mozaic's full-screen setup better than other AUv3 hosts.

    Custom UIs for controlling existing plugins sound like a great starting point for exposing UI-related components.

  • Build 66:
    AudioOut now displays a list of icons for all the plugins upstream of it. Each icon works as a shortcut for maximising the plugin.
    Resizing a component now clears any selected components first.

  • @Jorge said:
    I think the idea of MIDI TX and MIDI RX plugins is very good ( probably exists already ), they could work in a similar way to the Audio TX and Audio RX plugins.

    Thinking more about this today. If you were at some point able to get the MIDI send/receive working then it would be cool to be able to send MIDI directly to the Audio Rx instrument plugin as an option. The benefit would be that the Rx plugin could then be treated as if it were the instrument that was providing the audio. The MIDI would flow back to either the Tx side, or a separate MIDI Rx module and then directed to the instrument that is producing the audio. Here is an example use-case for using an IAA synth in a host that does not support IAA instruments:

    • Add Audio Rx as MIDI instrument in your host
    • In AUM, load up an IAA with a Tx instance capturing its output
    • Connect the MIDI-out of the Tx instance to the IAA
    • You should now be able to play MIDI directly into the host Audio Rx plugin (goes from Rx to Tx and then to IAA) and hear the IAA coming through as if the IAA were an AU plugin.

    There is no MIDI configuration whatsoever needed from the host to the IAA as that is handled inside AUM. This separation would let you use a more generic host template that is not tied to whichever IAA you are using for that session.

    Those familiar with Synthjacker would also see how this could be used to easily sample IAAs using its ability to sample directly from an AU :wink:

    Really impressed with the speed of the updates, I guess that speaks to the power of the framework you are using to build this stuff.

  • @MisplacedDevelopment said:

    @Jorge said:
    I think the idea of MIDI TX and MIDI RX plugins is very good ( probably exists already ), they could work in a similar way to the Audio TX and Audio RX plugins.

    Thinking more about this today. If you were at some point able to get the MIDI send/receive working then it would be cool to be able to send MIDI directly to the Audio Rx instrument plugin as an option. The benefit would be that the Rx plugin could then be treated as if it were the instrument that was providing the audio. The MIDI would flow back to either the Tx side, or a separate MIDI Rx module and then directed to the instrument that is producing the audio. Here is an example use-case for using an IAA synth in a host that does not support IAA instruments:

    • Add Audio Rx as MIDI instrument in your host
    • In AUM, load up an IAA with a Tx instance capturing its output
    • Connect the MIDI-out of the Tx instance to the IAA
    • You should now be able to play MIDI directly into the host Audio Rx plugin (goes from Rx to Tx and then to IAA) and hear the IAA coming through as if the IAA were an AU plugin.

    There is no MIDI configuration whatsoever needed from the host to the IAA as that is handled inside AUM. This separation would let you use a more generic host template that is not tied to whichever IAA you are using for that session.

    Those familiar with Synthjacker would also see how this could be used to easily sample IAAs using its ability to sample directly from an AU :wink:

    Really impressed with the speed of the updates, I guess that speaks to the power of the framework you are using to build this stuff.

    Thanks. That makes a lot of sense to me. The current "VisualSwift-RX" plugin would then become the "MIDI TX - Audio RX" or "Remote Instrument" plugin. I really like this feature; it will have to be done. Maybe the same could be done for "Remote Effect" ( Audio TX - Audio RX ) and "Remote MIDI Processor" ( MIDI TX - MIDI RX ).

    As the framework matures, the value-to-work ratio goes up a lot. I'm still holding back features and components until it is very solid in terms of stability. The Apple analytics still show me around 1 crash per 20 sessions. I never got DubFilter to work; I know how to change the code to make it work, but the changes break many of the other plugins.

  • Hi @Jorge
    Sorry if this has already been made clear: can you (or anyone) take me through the steps of streaming audio from VisualSwift to a host (AUM for example), including the order of opening the apps…
    I’ve lost track of changes to this feature since the app has grown so rapidly!
    I’m using the beta build 1.2.7, and iOS pre-15. (Which may be the issue…)

  • edited January 2022

    @Littlewoodg said:
    Hi @Jorge
    Sorry if this has already been made clear: can you (or anyone) take me through the steps of streaming audio from VisualSwift to a host (AUM for example), including the order of opening the apps…
    I’ve lost track of changes to this feature since the app has grown so rapidly!
    I’m using the beta build 1.2.7, and iOS pre-15. (Which may be the issue…)

    Here's an example with one hosted TX Effect transmitting on channel 3 and one hosted RX Instrument receiving on the same channel.

    Both the TX and the RX in the example above are AUv3 plugins so you can do something similar in other hosts or even between them. The RX would be in the list of Instruments and the TX in the list of Effects.

    If you don't want the transmitter audio to go through to the output, set the THR slider to zero. The TX slider controls the volume of the transmitted audio.

    Here's the equivalent in AUM:

    I could be wrong but I'd expect it to work in iOS 14 too.

    The latest version is 1.2.7 build 66.

  • @Jorge said:

    @Littlewoodg said:
    Hi @Jorge
    Sorry if this has already been made clear: can you (or anyone) take me through the steps of streaming audio from VisualSwift to a host (AUM for example), including the order of opening the apps…
    I’ve lost track of changes to this feature since the app has grown so rapidly!
    I’m using the beta build 1.2.7, and iOS pre-15. (Which may be the issue…)

    Here's an example with one hosted TX Effect transmitting on channel 3 and one hosted RX Instrument receiving on the same channel.

    Both the TX and the RX in the example above are AUv3 plugins so you can do something similar in other hosts or even between them. The RX would be in the list of Instruments and the TX in the list of Effects.

    If you don't want the transmitter audio to go through to the output, set the THR slider to zero. The TX slider controls the volume of the transmitted audio.

    Here's the equivalent in AUM:

    I could be wrong but I'd expect it to work in iOS 14 too.

    The latest version is 1.2.7 build 66.

    Thank you bigtime!

    Is there a map for bringing MIDI in from AUM at the same time as sending the audio to AUM?

  • @Littlewoodg said:
    Is there a map for bringing MIDI in from AUM at the same time as sending the audio to AUM?

    I'll be working on MIDI TX and RX in the next few days; I'm just letting the ideas settle a bit first.

  • edited January 2022

    I've received an anonymous TestFlight report from someone in Germany with constructive criticism.

    This kind of critical feedback tends to be really useful, especially if I can get more details. I'd really like to pass this person's quality control. Please get in touch if you can and help me create the app you'd like to have; it would be a win-win.

    I'd like to touch on some of the points; I'll paraphrase to keep anonymity.

    1) Multiple MIDI outputs not exposed.

    I can clearly see in the code why this is: I had wrongly assumed that when an instrument has MIDI output, it has only one. This will be an easy fix. If you ( or someone else ) could provide the name of an example plugin to use for testing, that would be great.
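
    For illustration, the fix will likely look something like this on the hosting side ( a sketch, not the actual VisualSwift code ):

        import AVFoundation

        // Sketch: enumerate all MIDI outputs instead of assuming one.
        func attachMIDIOutputs(of unit: AUAudioUnit) {
            let names = unit.midiOutputNames  // one entry per MIDI output
            unit.midiOutputEventBlock = { sampleTime, cable, length, bytes in
                // `cable` indexes into `names`: route each output
                // separately instead of funnelling everything to output 0.
                let data = Data(bytes: bytes, count: length)
                print("\(names[Int(cable)]) @ \(sampleTime): \(data.count) bytes")
                return noErr
            }
        }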

    2) MIDI data gets lost.

    I can see a bit of code that could be the source of this problem. A specific example for testing would be great.

    3) Should allocate resources after connecting AudioUnit

    Thanks, that's quite specific and a very good hint as to where to look. Again, the name of a relevant plugin for testing would be great.
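
    For reference, my understanding of the suggested ordering, sketched against a raw AUAudioUnit ( an assumption about what's meant ):

        import AVFoundation

        // Sketch: configure formats and connections first,
        // allocate render resources last.
        func prepare(_ unit: AUAudioUnit) throws {
            let format = AVAudioFormat(standardFormatWithSampleRate: 44_100,
                                       channels: 2)!
            if unit.inputBusses.count > 0 {
                try unit.inputBusses[0].setFormat(format)
            }
            try unit.outputBusses[0].setFormat(format)
            // Only now, with the graph settled:
            try unit.allocateRenderResources()
        }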

    4) connect/disconnect should not stop/start audio

    I've been very conservative in order to avoid crashes: at the moment, each time the audio graph changes, the audio engine is stopped and restarted. With a bit of work it will be possible to pause it instead, which means the playing audio wouldn't be disturbed.
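
    The pause-based approach would be roughly this ( a sketch, assuming AVAudioEngine ):

        import AVFoundation

        // Sketch: pause() keeps render resources allocated, so resuming
        // after a graph edit doesn't disturb audio the way a full
        // stop/start teardown does.
        func rewire(_ engine: AVAudioEngine, _ change: () -> Void) {
            engine.pause()      // halt rendering, keep resources
            change()            // e.g. connect/disconnect nodes
            try? engine.start() // resume
        }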

    5) Windows are in the way.

    Not sure what you mean by this, but anything UI-based tends to be easy to change or fix. The challenging part is anything related to AudioUnits. If you give me more details, it's very likely I'd have this fixed quickly.

    6) Automatic screen resizing is not intuitive.

    As with point 5, with more details I'm confident I can fix this.

    7) Audio tails are cut away.

    The audio units generate and stream audio between themselves; I'm not doing anything to affect the sound. This could be a case of the audio engine being in a wrong state because of issues like not allocating resources at the right time. I'd be interested to know whether the issue appears straight away after a restart. A very specific example for testing would take me 80% of the way to a fix.

    EDIT: The anonymous TestFlight user got in touch and sent more details. I think the issues are a lot less serious than I initially thought, with point 2 not being an issue at all.

  • I think you definitely need the screen real estate to use this app to its full potential; it would be great on a 16-inch iPad Pro Max screen if they do hit the market this year. I find the app a bit fiddly on a 9.7-inch, but with the tabletop format that's the nature of the beast.

  • edited January 2022

    @Jumpercollins said:
    I think you definitely need the screen real estate to use this app to its full potential; it would be great on a 16-inch iPad Pro Max screen if they do hit the market this year. I find the app a bit fiddly on a 9.7-inch, but with the tabletop format that's the nature of the beast.

    When you're creating a flat schematic, I can see how you'd quickly feel the lack of screen real estate. I hope this won't be much of a problem with the Module, Input and Output components, which will allow schematics to be hierarchical. You'll then be able to separate your schematic into sections, each section inside a Module, and the inputs and outputs used inside a module are exposed at the level above.
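
    As a rough sketch of the idea ( all names illustrative ):

        // Sketch: Input/Output components placed inside a Module become
        // that Module's ports one level up in the schematic.
        indirect enum Node {
            case component(kind: String)
            case module(name: String, children: [Node])
        }

        func exposedPorts(of node: Node) -> [String] {
            guard case let .module(_, children) = node else { return [] }
            return children.compactMap {
                if case let .component(kind) = $0,
                   kind == "Input" || kind == "Output" { return kind }
                return nil
            }
        }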

  • @Jorge
    I think you mentioned exposing UI construction elements in one of your previous posts.

    If users could build their own control surfaces capable of sending MIDI CC, that would give your app a capability (IMO) not yet fully developed on the iPad platform.

    The concept:
    Many iPad apps now provide MIDI-learn capability for parameter control.
    Stand-alone Drambo is the only AU host app I'm aware of that provides some capability for building control sets that can send MIDI CC to adjust hosted AU parameters without opening the AU. But Drambo's control options are limited and not elegant.

    An AU host with the means to construct elegantly designed UI control surfaces, with a full range of knobs, sliders, switches, rotary switches, buttons, pads, etc., and the capacity to arrange the components into ergonomically pleasing layouts, should theoretically provide a new way for users to work with multiple AUs on the iPad platform.

    The beneficial aspect of AU-host-level, user-configurable, dedicated control surfaces is that they optimize the limited screen area on the iPad when, for example, a user is working with multiple loaded AUs and it's not feasible to have all of them open on screen at the same time. In such a case, the user can create a custom control surface with control sets linked to the wanted parameters of each AU, arranged on the surface in a way that is intuitive for the user to operate.

    I understand it's your intention to provide the capability to construct and compile more than just control surfaces with VisualSwift. But I'd agree that starting with control surfaces might provide users with an easier introduction to the app-construction purpose of VisualSwift: not overwhelming users with a plethora of features, and beginning with a feature set that functions as an introductory level to building with the VisualSwift paradigm.

    This would provide an interesting new kind of host, offering AU control elements and connectivity features that (to the best of my knowledge) are not available in other hosts.

  • @horsetrainer said:
    @Jorge
    I think you mentioned exposing UI construction elements in one of your previous posts.

    If users could build their own control surfaces capable of sending MIDI CC, that would give your app a capability (IMO) not yet fully developed on the iPad platform.

    The concept:
    Many iPad apps now provide MIDI-learn capability for parameter control.
    Stand-alone Drambo is the only AU host app I'm aware of that provides some capability for building control sets that can send MIDI CC to adjust hosted AU parameters without opening the AU. But Drambo's control options are limited and not elegant.

    An AU host with the means to construct elegantly designed UI control surfaces, with a full range of knobs, sliders, switches, rotary switches, buttons, pads, etc., and the capacity to arrange the components into ergonomically pleasing layouts, should theoretically provide a new way for users to work with multiple AUs on the iPad platform.

    The beneficial aspect of AU-host-level, user-configurable, dedicated control surfaces is that they optimize the limited screen area on the iPad when, for example, a user is working with multiple loaded AUs and it's not feasible to have all of them open on screen at the same time. In such a case, the user can create a custom control surface with control sets linked to the wanted parameters of each AU, arranged on the surface in a way that is intuitive for the user to operate.

    I understand it's your intention to provide the capability to construct and compile more than just control surfaces with VisualSwift. But I'd agree that starting with control surfaces might provide users with an easier introduction to the app-construction purpose of VisualSwift: not overwhelming users with a plethora of features, and beginning with a feature set that functions as an introductory level to building with the VisualSwift paradigm.

    This would provide an interesting new kind of host, offering AU control elements and connectivity features that (to the best of my knowledge) are not available in other hosts.

    Thanks, that is indeed one of the features I have in mind, and it's great to see it described by someone else very much as I see it; it works as validation and motivation. "Not overwhelming users with a plethora of features": that is the big challenge for me, because I know overwhelming them would just drive them away.

  • edited January 2022

    Build 67:
    You can now sync VisualSwift's transport to a channel.
    The VisualSwift-RX plugin now sends the host's beat position and play/stop state to the selected channel.
    The following screenshots show an example where audio is being transmitted from VisualSwift through channel 7 to the RX plugin hosted inside AUM. The RX plugin is sending AUM's transport information back to VisualSwift.
    To sync VisualSwift to a channel, tap the bar:beat information in the navbar. When VisualSwift is being controlled by a channel, the rewind/play buttons are hidden and a square with the channel number is displayed instead.

    This is an initial version for testing; it can be improved, and suggestions are welcome.
    For example, the RX plugin should maybe have a switch to enable or disable sending host transport information to the channel.
    I wonder if there are any advantages in allowing the transport channel to be selected separately from the audio channel.
    I haven't worried about performance yet; audio, beat position and start/stop are all being transmitted separately. If required, performance can be improved by combining, for example, beat position and start/stop in one transmission.
    There might be a sample-precision mismatch of one buffer size ( I didn't check yet ); if so, it should be easy to fix.
    BPM is not being transmitted yet; I guess that should be next.
    I've just noticed the Sync To Channel checkbox has an xmark instead of a checkmark; I'll fix that for the next upload.

    EDIT:
    Build 68:
    The VisualSwift-RX plugin now sends the host's BPM through the selected channel to be picked up by VisualSwift's host app for playing in sync.
    Again, there's lots of room for improving performance if required, as the BPM information uses a separate channel and is transmitted at the same rate as audio buffers ( it's probably not necessary to update so often, and BPM + beat position + start/stop should be combined in one transmission ).

    I'm not sure if it's useful to have 16 separate Transport channels.
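
    For illustration, combining them could be as simple as one packed message per buffer ( hypothetical, not the current wire format ):

        // Sketch: pack beat position, tempo and play state into a single
        // message instead of three separate per-buffer transmissions.
        struct TransportPacket {
            var beatPosition: Double  // host beat position at buffer start
            var bpm: Double           // host tempo
            var isPlaying: Bool       // play/stop state
        }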

  • Build 69:
    New MIDI-RX component.
    The following screenshots show an example of sending MIDI events from AUM to VSM through channel 2.

    MIDI events are currently not placed with sample precision inside the buffers; that will be done next.
    This means it should be OK for playing live, but not sample-precise with sequencers.
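
    For the curious: sample precision will likely come down to scheduling each event with an in-buffer offset, along these lines ( a sketch of the standard AUv3 mechanism, not the final code ):

        import AVFoundation

        // Sketch: AUEventSampleTimeImmediate plus an offset (0-4095)
        // schedules the event that many samples into the next render cycle.
        func send(_ bytes: [UInt8], sampleOffset: Int, to unit: AUAudioUnit) {
            guard let schedule = unit.scheduleMIDIEventBlock else { return }
            bytes.withUnsafeBufferPointer { buffer in
                schedule(AUEventSampleTimeImmediate + AUEventSampleTime(sampleOffset),
                         0,                  // cable number
                         buffer.count,
                         buffer.baseAddress!)
            }
        }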

    The RX plugin now allows selecting what's transmitted through the channel, which should be useful for troubleshooting.
    I think that with everything being transmitted there might be crackling until performance is improved. Performance improvements should come in the next builds.

  • Excellent news @Jorge, I have been looking forward to this update, so I will be playing with this tonight!

  • Had a very quick look (could not help myself!). Is it the case that MIDI RX is only a VisualSwift component at the moment, not a separate plugin?

  • edited January 2022

    @MisplacedDevelopment said:
    Had a very quick look (could not help myself!). Is it the case that MIDI RX is only a VisualSwift component at the moment, not a separate plugin?

    With this build, yes. But the current transmission of MIDI events from the RX plugin to the channel will still be used when a generic MIDI receiver plugin gets implemented, so it's halfway towards that.

    The idea is that you'll have all the types of plugins for sending and receiving all kinds of data to and from the channels.
    For example, with a future VisualSwift MIDI Processor plugin ( it could be called VS-MIDITX-MIDIRX ), the MIDI events arriving from the host would be transmitted to a selected channel, and another selected channel would be used to output MIDI events to the host. You could then process and transform the MIDI events somewhere else, which could be VisualSwift or another host.

    I'm not sure how useful this could be, but maybe you or someone else will find some unexpected use-cases for it.

    I'm still trying to make sense of parameter data, and whether there is any advantage in transmitting and receiving it.

  • edited January 2022

    Build 70:
    New VisualSwift-MIDI-TX-RX MIDI Processor plugin.

    In the following example, MIDI events generated by the midiSTEPs sequencer are transmitted to channel 4.
    Both MIDI-TX-RX plugins are receiving on channel 4, which causes the midiSTEPs events to flow out to Pure Synth and Pure Acid.

    MIDI events are still not sample-precise; that will be done once everything else is working, in order to keep potential bugs easy to troubleshoot.

    The VisualSwift-MIDI-TX-RX plugin is an AUv3 MIDI Processor, so it should work in other hosts too, and also between two different hosts, which is the main point of it.

  • Just in time for me to start playing! You work so fast.

    When I add MIDI-TX-RX it does not show up in the routing of AUM as a MIDI source. My first test is to go from a DAW track with Rx as the instrument, transmitting MIDI on channel 1. In AUM I have an IAA going into a Tx. The goal is to control the IAA from the DAW using only your plugins. If I add MIDI-TX-RX in AUM so that it receives the MIDI from the Rx in the DAW, there does not seem to be a way to route this to the IAA in AUM, only to another instance of the plugin or to VS itself. Am I missing a step?

  • edited January 2022

    @MisplacedDevelopment said:
    Just in time for me to start playing! You work so fast.

    When I add MIDI-TX-RX it does not show up in the routing of AUM as a MIDI source. My first test is to go from a DAW track with Rx as the instrument, transmitting MIDI on channel 1. In AUM I have an IAA going into a Tx. The goal is to control the IAA from the DAW using only your plugins. If I add MIDI-TX-RX in AUM so that it receives the MIDI from the Rx in the DAW, there does not seem to be a way to route this to the IAA in AUM, only to another instance of the plugin or to VS itself. Am I missing a step?

    Maybe it's me who's missing a step. To be honest, I have almost zero experience with other hosts; my experience only goes as far as inserting an Instrument and an Effect in AUM. I didn't try the new plugin in AUM, as I don't know how to insert a MIDI Processor inside it.

    I've implemented this plugin as a MIDI Processor, and in the code I use inside VisualSwift to find plugins, searching for MIDI Processors finds all the available ones. Maybe there's some flag I need to set inside the plugin to notify the host that it receives MIDI. I assumed that a MIDI Processor by definition receives MIDI, processes it and sends MIDI, and that hosts would always know they have those capabilities.
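
    For anyone following along, the search I do is roughly the standard discovery API ( a sketch, not VisualSwift's exact code ):

        import AVFoundation
        import AudioToolbox

        // Sketch: AUv3 MIDI Processors carry the 'aumi' component type;
        // zeroed fields in the description act as wildcards.
        var description = AudioComponentDescription()
        description.componentType = kAudioUnitType_MIDIProcessor
        let processors = AVAudioUnitComponentManager.shared()
            .components(matching: description)
        for component in processors {
            print(component.manufacturerName, "-", component.name)
        }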

    So, I guess the test I now need to pass is to make it appear in the AUM routing as a source.
    I need to think; I've never used IAA either. I might try to replicate what you describe and try to make sense of it ( even though you describe it well ).

    I wonder if making the RX plugin receive MIDI and output it would solve your issue. I'm assuming instruments appear in AUM as MIDI sources if they support it.

  • @Jorge said:

    @MisplacedDevelopment said:
    Just in time for me to start playing! You work so fast.

    When I add MIDI-TX-RX it does not show up in the routing of AUM as a MIDI source. My first test is to go from a DAW track with Rx as the instrument, transmitting MIDI on channel 1. In AUM I have an IAA going into a Tx. The goal is to control the IAA from the DAW using only your plugins. If I add MIDI-TX-RX in AUM so that it receives the MIDI from the Rx in the DAW, there does not seem to be a way to route this to the IAA in AUM, only to another instance of the plugin or to VS itself. Am I missing a step?

    Maybe it's me who's missing a step. To be honest, I have almost zero experience with other hosts; my experience only goes as far as inserting an Instrument and an Effect in AUM. I didn't try the new plugin in AUM, as I don't know how to insert a MIDI Processor inside it.

    I've implemented this plugin as a MIDI Processor, and in the code I use inside VisualSwift to find plugins, searching for MIDI Processors finds all the available ones. Maybe there's some flag I need to set inside the plugin to notify the host that it receives MIDI. I assumed that a MIDI Processor by definition receives MIDI, processes it and sends MIDI, and that hosts would always know they have those capabilities.

    So, I guess the test I now need to pass is to make it appear in the AUM routing as a source.
    I need to think; I've never used IAA either. I might try to replicate what you describe and try to make sense of it ( even though you describe it well ).

    I wonder if making the RX plugin receive MIDI and output it would solve your issue. I'm assuming instruments appear in AUM as MIDI sources if they support it.

    I do not know how to set up a plugin as capable of sending MIDI, I'm afraid. Is there a MIDI effect category? Hopefully someone more knowledgeable will be able to chip in with advice.

    It does not have to be an IAA to test with; the MIDI routing will be the same if the source instrument is an AUv3. I was using an IAA because being able to control an IAA directly from a track in a DAW that does not otherwise support IAAs was an interesting use-case to test with.

    Having the MIDI Rx/Tx in one plugin saves having two separate Tx and Rx plugins, but you may need a way of setting the channel for each of Tx and Rx to ”none” so that you can use it as a:

    • transmitter (Rx channel set to none, but can still transmit MIDI routed to it from the host)
    • receiver (Tx channel set to none, but can still forward the MIDI it receives internally to the host)
    • bridge (as it is now where you set both Tx and Rx channels and the MIDI is routed via the internal VS channels. In this mode you may want to limit whether MIDI is sent to the host and any MIDI sent to it from the host is ignored?)

    It should be possible for the Rx instrument to generate MIDI. The free StreamByter app is an example of an instrument that appears in the AUM routing panel as a MIDI source.

  • @MisplacedDevelopment said:

    @Jorge said:

    @MisplacedDevelopment said:
    Just in time for me to start playing! You work so fast.

    When I add MIDI-TX-RX it does not show up in the routing of AUM as a MIDI source. My first test is to go from a DAW track with Rx as the instrument, transmitting MIDI on channel 1. In AUM I have an IAA going into a Tx. The goal is to control the IAA from the DAW using only your plugins. If I add MIDI-TX-RX in AUM so that it receives the MIDI from the Rx in the DAW, there does not seem to be a way to route this to the IAA in AUM, only to another instance of the plugin or to VS itself. Am I missing a step?

    Maybe it's me who's missing a step. To be honest, I have almost zero experience with other hosts; my experience only goes as far as inserting an Instrument and an Effect in AUM. I didn't try the new plugin in AUM, as I don't know how to insert a MIDI Processor inside it.

    I've implemented this plugin as a MIDI Processor, and in the code I use inside VisualSwift to find plugins, searching for MIDI Processors finds all the available ones. Maybe there's some flag I need to set inside the plugin to notify the host that it receives MIDI. I assumed that a MIDI Processor by definition receives MIDI, processes it and sends MIDI, and that hosts would always know they have those capabilities.

    So, I guess the test I now need to pass is to make it appear in the AUM routing as a source.
    I need to think; I've never used IAA either. I might try to replicate what you describe and try to make sense of it ( even though you describe it well ).

    I wonder if making the RX plugin receive MIDI and output it would solve your issue. I'm assuming instruments appear in AUM as MIDI sources if they support it.

    I do not know how to set up a plugin as capable of sending MIDI, I'm afraid. Is there a MIDI effect category? Hopefully someone more knowledgeable will be able to chip in with advice.

    It does not have to be an IAA to test with; the MIDI routing will be the same if the source instrument is an AUv3. I was using an IAA because being able to control an IAA directly from a track in a DAW that does not otherwise support IAAs was an interesting use-case to test with.

    Having the MIDI Rx/Tx in one plugin saves having two separate Tx and Rx plugins, but you may need a way of setting the channel for each of Tx and Rx to ”none” so that you can use it as a:

    • transmitter (Rx channel set to none, but can still transmit MIDI routed to it from the host)
    • receiver (Tx channel set to none, but can still forward the MIDI it receives internally to the host)
    • bridge (as it is now where you set both Tx and Rx channels and the MIDI is routed via the internal VS channels. In this mode you may want to limit whether MIDI is sent to the host and any MIDI sent to it from the host is ignored?)

    It should be possible for the Rx instrument to generate MIDI. The free StreamByter app is an example of an instrument that appears in the AUM routing panel as a MIDI source.

    Thanks for the very detailed information. From what you say, here are my conclusions:

    1 - IAA can be left out of the equation, which is one less thing I need to think about.
    2 - I will enhance the MIDI Rx/Tx plugin to allow enabling/disabling of Tx/Rx.
    3 - I will enhance the existing Audio Rx plugin to allow receiving MIDI from a channel and sending it out to the host. I'll test it by checking whether it appears in the AUM routing panel ( rough sketch below ).
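
    If my reading of the AUAudioUnit API is right, point 3 mostly comes down to publishing MIDI output names, which is what makes hosts list a plugin as a MIDI source ( a sketch, with a hypothetical class name ):

        import AVFoundation

        // Sketch: hosts like AUM read midiOutputNames to decide whether
        // a plugin appears as a MIDI source in their routing panel.
        class RxAudioUnit: AUAudioUnit {
            override var midiOutputNames: [String] { ["MIDI Out"] }
        }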

    I hope that will sort all your use-cases; let me know if you have other suggestions.

  • @Jorge said:

    Thanks for the very detailed information. From what you say, here are my conclusions:

    1 - IAA can be left out of the equation, which is one less thing I need to think about.
    2 - I will enhance the MIDI Rx/Tx plugin to allow enabling/disabling of Tx/Rx.
    3 - I will enhance the existing Audio Rx plugin to allow receiving MIDI from a channel and sending it out to the host. I'll test it by checking whether it appears in the AUM routing panel.

    I hope that will sort all your use-cases; let me know if you have other suggestions.

    Thanks Jorge, I believe that should do it.

    From a usability POV - are the VS channel numbers mapped to actual MIDI channels, or are they just internal route labels? If the latter, then it may be a good idea to assign them letters; otherwise people will confuse them with MIDI channel numbers.

  • edited January 2022

    @MisplacedDevelopment said:

    @Jorge said:

    Thanks for the very detailed information. From what you say, here are my conclusions:

    1 - IAA can be left out of the equation, which is one less thing I need to think about.
    2 - I will enhance the MIDI Rx/Tx plugin to allow enabling/disabling of Tx/Rx.
    3 - I will enhance the existing Audio Rx plugin to allow receiving MIDI from a channel and sending it out to the host. I'll test it by checking whether it appears in the AUM routing panel.

    I hope that will sort all your use-cases; let me know if you have other suggestions.

    Thanks Jorge, I believe that should do it.

    From a usability POV - are the VS channel numbers mapped to actual MIDI channels, or are they just internal route labels? If the latter, then it may be a good idea to assign them letters; otherwise people will confuse them with MIDI channel numbers.

    The latter. Very good suggestion, thanks; I'll do that so there's no confusion.

    I have thought of having a way to give them your own custom names, but maybe that's a bit overkill and would introduce quite a lot of opportunities for bugs.
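
    Something as simple as this mapping should do ( sketch ):

        // Sketch: show internal route indices as letters so they can't be
        // mistaken for MIDI channels 1-16.
        func routeLabel(_ index: Int) -> String {
            String(Character(UnicodeScalar(UInt8(ascii: "A") + UInt8(index))))
        }
        // routeLabel(0) == "A", routeLabel(2) == "C"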
