VS - Visual Synthesizer by Imaginando Lda

Comments

  • @sinosoidal said:

    @janpieter said:

    @sinosoidal said:

    @Toastedghost said:
    More experiments.
    Der and I keep writing over the default. Would be good if one button could clear all settings and materials, leaving a blank canvas. Has anyone considered the possibility of importing GIFs on the background layer?

    We have just added support for GIF and WebP. Do you wanna join the beta and give it a go?

    Great!
    Exclusively for the background, or is it also possible for other layers?

    Right now, only for background! This is still version 1.0.1! :lol:

    Is adding stuff to other layers planned on the roadmap? I'm tempted to buy the desktop app, but I need much more customization to pull the trigger. Also, eventual custom shaders etc.?

  • Another experiment

  • @NimboStratus said:
    Just a quick test to show the changing preset in action

    YouTube still processing the HD version.

    Nice one. Use Vimeo instead of YouTube: no massive processing time to wait.

  • @Toastedghost said:

    Nice one. Use Vimeo instead of YouTube: no massive processing time to wait.

    Thought you had to pay to upload to Vimeo.

  • @NimboStratus said:

    Thought you had to pay to upload to Vimeo.

    You have a total of 5 GB that you can upload on the free version. There may also be a weekly limit. But hey ho, it works much better for me.

  • @Toastedghost said:
    Another experiment

    That’s excellent. The eye treatment is quite clever!

  • @NimboStratus thanks for the detailed explanation and cool resulting video.

  • @Toastedghost said:
    Another experiment

    Dude! That’s frickin awesome 🤩

  • So that’s a background picture imported into VS right?

    I’ve spent like a total of 5 minutes with this app so far and my mind is blown. “I like this!” (Said out loud with a really exaggerated southern drawl)

  • @sinosoidal maybe I got lost in the discussion and someone already mentioned it. I would use it with my own stuff and know the tempo, but I see a benefit for those VJing for other people and needing a tap tempo in the app.

  • @wim said:

    @skiphunt said:
    I’m really enjoying this app so far… but I have a dumb question. What is the difference between loading VS as a music effect versus loading it as an instrument?

    I imagine that if you use it as an FX, you can pass audio in addition to MIDI through it to affect the visuals.

    I thought that was the case... but it seems to behave the same regardless of whether I have it loaded as an instrument OR FX.

    Maybe I'm missing something though. I played for a bit last night and I couldn't tell the difference between loaded as FX vs Instrument.

    I thought that maybe you could use the LFOs, etc. that you set up within VS to affect other sounds/instruments outside of VS. But I haven't figured out if you can actually do that yet, @sinosoidal.

  • @wim said:

    @wim said:
    Hey, is there any point in trying this on an iPad Air 2?

    It looks great, but no iPad upgrades are in the cards for me for awhile.

    Well glory be, it holds up pretty good even on this poor ol’ Air 2! B)
    Damn this app is fun. This is the one I’ve been waiting for. <3
    Amazing work!

    Yeah, it’s pretty efficient. I’m using it on an Air 1 and it’s still capable if I don’t go too nuts on layers.

  • VS was updated to 1.0.1. Here is the changelog:

    • Add undo/redo to paste layer operation
    • Add undo/redo to layer/background reset
    • Add resources browser option to background layer context menu
    • Add support for GIF and WebP (Giphy files)
    • Fixed bug that would cause enabled layers to appear as disabled
    • General bug fixes and improvements

    If you like VS, please rate it on the App Store. :blush:

    https://apps.apple.com/us/app/vs-visual-synthesizer/id1560330289

  • edited June 2021

    Thank you @NimboStratus for your directions.
    Now I'm able to change patterns with different background videos using MIDI notes (as you can see, the videos don't connect seamlessly, the MIDI notes do; there's some black space in between).
    Note: video quality is good (the first upload to Twitter has somehow lowered it; maybe it will get better).

  • @janpieter said:
    Thank you @NimboStratus for your directions.
    Now I'm able to change patterns with different background videos using MIDI notes (as you can see, the videos don't connect seamlessly, the MIDI notes do; there's some black space in between).

    Because it is loading a new video every time, it takes time to start the playback. But it's cool, even with the black space! I will need to find a solution for this requirement.

  • @sinosoidal said:

    Because it is loading a new video every time, it takes time to start the playback. But it's cool, even with the black space! I will need to find a solution for this requirement.

    Agreed! Not meant as a complaint, but of course it would be nice if that were possible.
    I'm happy with the possibility of using different background videos as it is now.

  • @skiphunt said:

    I thought that was the case... but it seems to behave the same regardless of whether I have it loaded as an instrument OR FX.

    Maybe I'm missing something though. I played for a bit last night and I couldn't tell the difference between loaded as FX vs Instrument.

    I thought that maybe you could use the LFOs, etc. that you set up within VS to affect other sounds/instruments outside of VS. But I haven't figured out if you can actually do that yet, @sinosoidal.

    You need to "connect" the incoming sound to parameters for it to do anything.

    The first thing to do is to set the threshold for the envelope trigger. The arrow at the bottom left brings the modulators up. On the right you should see a spectrograph of the incoming signal. Set the threshold so that it's just getting crossed when the sound peaks. The envelope fires whenever the amplitude of the incoming sound crosses the threshold. Just adjust the first of the four envelopes for this example.

    Now you need to connect the envelope to what you want to modulate. I've found the easiest way to do this is to long-press one of the controls on any of the layers except the base layer, which can't be modulated. You'll get a matrix showing that control and a few others. This is where you can set how much each modulator affects each control. Apply some amount of AM1 (Audio Modulator 1) to some parameters, and you should see the effect on the animation.

    Described much better here: https://www.imaginando.pt/products/vs/help/layer-modulations

    (A rough code sketch of the envelope-trigger idea follows below.)
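    In code terms, the trigger described above is just an envelope follower with a rising-edge test. Here's a minimal, hypothetical Swift sketch of that idea; it is not Imaginando's implementation, and the struct name and attack/release defaults are invented for illustration.

        import Foundation

        // Hypothetical sketch: smooth the incoming audio's amplitude and report
        // the moment that smoothed level crosses a user-set threshold upward.
        struct EnvelopeTrigger {
            var threshold: Float            // set just below the signal's peaks
            var attack: Float = 0.01        // smoothing times in seconds (assumed values)
            var release: Float = 0.25
            private var level: Float = 0    // the followed amplitude
            private var wasAbove = false

            // Feed one buffer of samples; returns true on a rising edge,
            // i.e. the moment the envelope should (re)trigger.
            mutating func process(_ buffer: [Float], sampleRate: Float) -> Bool {
                let attackCoeff = Float(exp(-1.0 / Double(attack * sampleRate)))
                let releaseCoeff = Float(exp(-1.0 / Double(release * sampleRate)))
                var triggered = false
                for sample in buffer {
                    let rectified = abs(sample)
                    let coeff = rectified > level ? attackCoeff : releaseCoeff
                    level = coeff * level + (1 - coeff) * rectified
                    let isAbove = level > threshold
                    if isAbove && !wasAbove { triggered = true }
                    wasAbove = isAbove
                }
                return triggered
            }
        }

    The modulation matrix step is then conceptually just scaling the envelope's current value by a per-control amount before it nudges each mapped parameter.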

  • Yes. Thanks. However, I've already figured out all of this... I've pretty much got a complete handle on how the app works in stand-alone AND as an AU.

    I'm just saying that when I load VS as an instrument OR music effect in apeMatrix... it behaves pretty much the same regardless. All of the midi and/or sound reactivity is the same regardless of which version I'm using. There's got to be a good reason why there are separate versions though. They both respond to all the same input in the same way. I can't tell a difference between either of them.

    Also... the settings in VS show that it has midi out activated. But, I haven't figured out how to get anything outside of VS to react to the midi out that it's sending. Have you?

  • Another test. Sorry, I used the same track as before, with a slight update.

  • wim
    edited June 2021

    @skiphunt said:
    Yes. Thanks. However, I've already figured out all of this... I've pretty much got a complete handle on how the app works in stand-alone AND as an AU.

    I'm just saying that when I load VS as an instrument OR music effect in apeMatrix... it behaves pretty much the same regardless. All of the midi and/or sound reactivity is the same regardless of which version I'm using. There's got to be a good reason why there are separate versions though. They both respond to all the same input in the same way. I can't tell a difference between either of them.

    I didn't realize you were using apeMatrix. In AUM, in order to route audio to something it has to be in an FX slot, so maybe it makes no difference in apeMatrix. Whether or not something is an FX is just part of the Apple audio unit type identifier; unless the programmer actually writes code to limit functionality based on that identifier, the versions would normally act the same. Most hosts depend on the type identifier to decide where a plug-in can go. Maybe apeMatrix doesn't care. (See the sketch after this post for how those type identifiers look in code.)

    Also... the settings in VS show that it has midi out activated. But, I haven't figured out how to get anything outside of VS to react to the midi out that it's sending. Have you?

    I don't see any midi coming out from either the AUv3 or the standalone. I'm not sure what I'd use it for, but seems like it could be useful for something if it works.
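    For the curious, that type identifier is visible from code: every AUv3 is registered under an AudioComponentDescription, and its componentType ('aumu' for an instrument, 'aumf' for a music effect that accepts MIDI, 'aufx' for a plain audio effect) is what a host reads when deciding which slots a plug-in may occupy. A small, hypothetical Swift sketch (the function name and the name filter are just for illustration, not anything from VS or a particular host):

        import AVFoundation
        import AudioToolbox

        // List installed AUv3 components whose name contains `name`,
        // grouped by the Apple component type that hosts see.
        func listComponents(named name: String) {
            let manager = AVAudioUnitComponentManager.shared()
            let types: [(OSType, String)] = [
                (kAudioUnitType_MusicDevice, "aumu (instrument)"),
                (kAudioUnitType_MusicEffect, "aumf (music effect, accepts MIDI)"),
                (kAudioUnitType_Effect,      "aufx (audio effect)")
            ]
            for (type, label) in types {
                var description = AudioComponentDescription()
                description.componentType = type   // other fields stay 0 and act as wildcards
                let matches = manager.components(matching: description)
                    .filter { $0.name.localizedCaseInsensitiveContains(name) }
                for component in matches {
                    print(label, "-", component.manufacturerName, component.name)
                }
            }
        }

        // e.g. listComponents(named: "VS")

    If a host only uses the type to fill its menus and routes audio and MIDI the same way afterwards, the 'aumu' and 'aumf' builds of the same plug-in will behave identically once loaded, which would explain what's being seen in apeMatrix.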

  • @Toastedghost said:
    Another test, sorry use the same track as before with a slight update.

    Wicked.

    Loved what you've done with the materials within the eyes.

    Yeah.
    Awesome.

  • @wim said:

    I didn't realize you were using apeMatrix. In AUM, in order to route audio to something it has to be in an FX slot, so maybe it makes no difference in apeMatrix. Whether or not something is an FX is just part of the Apple audio unit type identifier; unless the programmer actually writes code to limit functionality based on that identifier, the versions would normally act the same. Most hosts depend on the type identifier to decide where a plug-in can go. Maybe apeMatrix doesn't care.

    I don't see any midi coming out from either the AUv3 or the standalone. I'm not sure what I'd use it for, but seems like it could be useful for something if it works.

    That’s probably it.

    And yes… if midi out really does work that could be interesting to play with

  • @skiphunt said:

    That’s probably it.

    And yes… if midi out really does work that could be interesting to play with

    I'm not used to ApeMatrix, but in AUM you won't get audio input unless you declare it as an audio effect, route all the tracks to a bus, and use the bus as the input of the audio track where VS is inserted as an FX. Maybe ApeMatrix overrides this bureaucracy.

    We are not using the midi out and we can't imagine a use for it right now. Can anyone foresee a use for it?

  • @sinosoidal said:

    We are not using the midi out and we can't imagine a use for it right now. Can anyone foresee a use for it?

    Using MIDI out to easily record parameter tweaking (in Xequence for instance) and be able to play it back? Sort of straightforward automation recording/playback.

  • @sinosoidal said:

    We are not using the midi out and we can't imagine a use for it right now. Can anyone foresee a use for it?

    A few things that come to mind...
    Send the LFO out to control a filter sweep in sync with a video modulation. Trigger volume swells in sync with an audio modulation envelope. Record the modulators to play back into the app so that you can remove the audio input... (There's a rough sketch of the LFO-to-CC idea after this post.)

    But more importantly, if MIDI out isn’t used, then the options shouldn’t be there; otherwise it just leads to wasted time discovering that it doesn’t do anything, bug reports, confusion, etc.
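    To make the first idea concrete, here's a hypothetical CoreMIDI sketch of what a usable MIDI out could look like from the sending side: a virtual source streaming an LFO value as CC 74, which a synth's filter cutoff (or a recorder like Xequence) could then follow. None of this is VS's actual implementation; the class name, source name, and CC number are invented for illustration.

        import CoreMIDI
        import Foundation

        // Hypothetical: publish a virtual MIDI source and stream an LFO
        // value (0...1) out of it as a continuous controller message.
        final class LFOToCC {
            private var client = MIDIClientRef()
            private var source = MIDIEndpointRef()

            init() {
                MIDIClientCreate("LFOBridge" as CFString, nil, nil, &client)
                MIDISourceCreate(client, "VS LFO (sketch)" as CFString, &source)
            }

            // Call from a timer or render callback with the current LFO value.
            func send(lfoValue: Double) {
                let value = UInt8(max(0, min(127, lfoValue * 127)))
                let bytes: [UInt8] = [0xB0, 74, value]   // CC 74 on MIDI channel 1
                var packetList = MIDIPacketList()
                let packet = MIDIPacketListInit(&packetList)
                MIDIPacketListAdd(&packetList, MemoryLayout<MIDIPacketList>.size,
                                  packet, 0, bytes.count, bytes)
                MIDIReceived(source, &packetList)        // emit from the virtual source
            }
        }

    Recording that CC in a sequencer and playing it back at VS would also cover the "straightforward automation recording/playback" idea mentioned above.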

  • @wim said:

    A few things that come to mind...
    Send the LFO out to control a filter sweep in sync with a video modulation. Trigger volume swells in sync with an audio modulation envelope. Record the modulators to play back into the app so that you can remove the audio input...

    But more importantly, if MIDI out isn’t used, then the options shouldn’t be there; otherwise it just leads to wasted time discovering that it doesn’t do anything, bug reports, confusion, etc.

    I guess it's maybe not essential as most (or all?) things can also be achieved otherwise?

  • wim
    edited June 2021

    @janpieter said:

    I guess it's maybe not essential as most (or all?) things can also be achieved otherwise?

    Yep. It’s just odd to have an option exist that doesn’t do anything.

  • This app is awesome. I just tried using the iPad app hooked up to my Mac through IDAM, with Ableton Live (linked) sending it MIDI, and keeping the iPad in full screen. It works great, actually…

  • @sinosoidal said:

    I'm not used to ApeMatrix, but in AUM you won't get audio input unless you declare it as an audio effect, route all the tracks to a bus, and use the bus as the input of the audio track where VS is inserted as an FX. Maybe ApeMatrix overrides this bureaucracy.

    We are not using the midi out and we can't imagine a use for it right now. Can anyone foresee a use for it?

    Yes, it's somewhat the same in apeMatrix. If I want to route audio into VS, it has to be loaded as a music effect. However, if my input is MIDI, it doesn't matter which version of VS is loaded, i.e. music effect vs. instrument.

    My question is... when would I want to load VS as an instrument vs. a music effect?

    Regarding midi out... first of all, I was just trying to figure out if it worked and what's it's for since it's there.

    Secondly, I can see an interesting use for midi out. Normal use of VS is to create audio and/or midi that will affect various graphic layers within VS. I can see the opposite use, where you are creating graphics and LFOs that affect external sounds.

    In other words, right now the audio source is the master, and the graphics are the slave. If midi out worked, the graphics could be master and the audio would be slave.

  • @skiphunt said:

    Yes, it's somewhat the same in apeMatrix. If I want to route audio into VS, it has to be loaded as a music effect. However, if my input is MIDI, it doesn't matter which version of VS is loaded, i.e. music effect vs. instrument.

    My question is... when would I want to load VS as an instrument vs. a music effect?

    It could be useful if you just want to feed it midi and not audio. Or if you just want to give it a spin very quickly. Taking that option out will make it harder to find in the list.

    Regarding midi out... first of all, I was just trying to figure out if it worked and what's it's for since it's there.

    Secondly, I can see an interesting use for midi out. Normal use of VS is to create audio and/or midi that will affect various graphic layers within VS. I can see the opposite use, where you are creating graphics and LFOs that affect external sounds.

    In other words, right now the audio source is the master, and the graphics are the slave. If midi out worked, the graphics could be master and the audio would be slave.

    Generating midi from the visuals seems a bit far-fetched. Maybe it is better to take that option out to avoid further confusion. :blush:
