
VS - Visual Synthesizer by Imaginando Lda


Comments

  • @joaehun said:
    Hi, I’ve scrolled a bit through this thread but didn’t find my answer:
    If I run an instance of AUv3 VS in Loopy Pro, what are the routing possibilities from Colour Groups to VS channels for a live performance?
    And would I be able to project this output to an external monitor or projector directly from my iPad?
    Thanks to all at Imaginando for the good work!

    VS is an effect, so you could add it to a bus and have bus sends from your color groups. You could adjust the bus send levels to control which color groups feed it.

    Something like this:


  • Ah yes, I see. I like that. But I’ve contacted them, and apparently it wouldn’t be possible to stream the output live for a live performance?
    Would it be possible to open it on a different ‘screen’ and project that via Sidecar to a projector or another display?


  • I’ve searched this forum and all over the manual; hopefully I missed it. Is it possible to MIDI-map the “LFOs” in the matrix? (From an external controller)

  • @BongoJak said:

    I’ve searched this forum and all over the manual; hopefully I missed it. Is it possible to MIDI-map the “LFOs” in the matrix? (From an external controller)

    Yes. All of VS’s parameters are exposed, so you can MIDI-map anything to your heart’s desire.

  • @Polyphonix said:

    Yes. All of VS’s parameters are exposed, so you can MIDI-map anything to your heart’s desire.

    Thanks so much for the response. I can’t figure it out for the life of me: MIDI Learn on + touch anything in the matrix + twist knob CC 15 (from controller) + MIDI Learn off = nope.

    I can get MIDI Learn to work for the layer controls without an issue (same steps as above), but when trying to map anything in the LFO matrix, sadly nothing fires.

  • edited December 2023

    @BongoJak said:

    Thanks so much for the response. I can’t figure it out for the life of me: MIDI Learn on + touch anything in the matrix + twist knob CC 15 (from controller) + MIDI Learn off = nope.

    I can get MIDI Learn to work for the layer controls without an issue (same steps as above), but when trying to map anything in the LFO matrix, sadly nothing fires.

    If you use VS stand-alone, you can only apply MIDI Learn to the 16 parameters directly below the layer buttons. To learn a value within the matrix, you would have to select two parameters simultaneously with your finger in order to then address the value at the intersection of those two parameters with a controller. With MIDI Learn, however, only one control can be learned at a time. It won’t work the way you’d like, in VS or in any other matrix.

    So to get direct access to each individual value via MIDI Learn, you have to use a host such as AUM, which exposes all parameters and their corresponding values.

    Edit:
    To make ‘parameter’ more precise and less misleading: I mean the modulators LFO, EG, AM, KBD and VEL on one side of the matrix coordinate system, and the attributes Hue, Sat, Brightness and Alpha on the other. The value to be manipulated sits at the intersection point within the XY matrix. Due to the technical limitations of MIDI Learn, it is not possible to target the modulator and the attribute at the same time in stand-alone mode. You have to use a host to learn the parameters exposed there.
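    To picture why the host route works, think of the matrix flattened into one addressable parameter per (modulator, attribute) cell, which is roughly what a host’s parameter list gives you. A minimal sketch of that idea in plain Python (hypothetical names only, not VS’s actual code or parameter list):

        # Hypothetical illustration: a host flattens the modulation matrix into
        # one parameter per (modulator, attribute) cell, so one CC can target
        # one intersection. Names and ranges are made up for the example.
        MODULATORS = ["LFO", "EG", "AM", "KBD", "VEL"]
        ATTRIBUTES = ["Hue", "Sat", "Brightness", "Alpha"]

        # Flat parameter list as a host might expose it, e.g. "LFO > Hue".
        matrix_params = {f"{m} > {a}": 0.0 for m in MODULATORS for a in ATTRIBUTES}

        # One MIDI mapping pairs one CC number with one flat parameter name.
        cc_map = {15: "LFO > Hue"}  # CC 15 is just the example from this thread

        def handle_cc(cc_number: int, cc_value: int) -> None:
            """Scale an incoming CC value (0-127) to a normalized 0.0-1.0 parameter."""
            name = cc_map.get(cc_number)
            if name is not None:
                matrix_params[name] = cc_value / 127.0

        handle_cc(15, 127)
        print(matrix_params["LFO > Hue"])  # 1.0, i.e. full LFO-to-Hue depth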

  • @Polyphonix said:

    If you use VS stand-alone, you can only apply MIDI Learn to the 16 parameters directly below the layer buttons. To learn a value within the matrix, you would have to select two parameters simultaneously with your finger in order to then address the value at the intersection of those two parameters with a controller. With MIDI Learn, however, only one control can be learned at a time. It won’t work the way you’d like, in VS or in any other matrix.

    So to get direct access to each individual value via MIDI Learn, you have to use a host such as AUM, which exposes all parameters and their corresponding values.

    Edit:
    To make ‘parameter’ more precise and less misleading: I mean the modulators LFO, EG, AM, KBD and VEL on one side of the matrix coordinate system, and the attributes Hue, Sat, Brightness and Alpha on the other. The value to be manipulated sits at the intersection point within the XY matrix. Due to the technical limitations of MIDI Learn, it is not possible to target the modulator and the attribute at the same time in stand-alone mode. You have to use a host to learn the parameters exposed there.

    Thank you SO much!!… Now I need to try it as an AU, I guess. I am trying to use it through a projector for visuals during my show (VS and projector on an old iPad; my good new iPad has all my synths, with MIDI being sent to the old iPad).

    The only downside of using it as an AU, for my purposes, is that you can’t use the “pop-out” full-screen function (for a projector).

    So external LFOs would be my next step, I guess (more complex routing in my already complex setup). That being said, what third-party LFO is your go-to? I have mLFO, but the 16 outputs in AUM’s MIDI matrix get things pretty crazy. Do you know of any with selectable outputs? (Or just one output.)

  • @BongoJak said:

    Thank you SO much!!… Now I need to try it as an AU, I guess. I am trying to use it through a projector for visuals during my show (VS and projector on an old iPad; my good new iPad has all my synths, with MIDI being sent to the old iPad).

    The only downside of using it as an AU, for my purposes, is that you can’t use the “pop-out” full-screen function (for a projector).

    So external LFOs would be my next step, I guess (more complex routing in my already complex setup). That being said, what third-party LFO is your go-to? I have mLFO, but the 16 outputs in AUM’s MIDI matrix get things pretty crazy. Do you know of any with selectable outputs? (Or just one output.)

    In fact, having MIDI Learn access to all available parameters only via a host prevents an immersive, real-time, full-screen live performance.
    @sinosoidal I think you must be pretty busy with BAM right now. But is there a chance that this gap will be closed in the future, so that all parameters can be mapped and performed full screen with external MIDI controllers?

    My go-to is mLFO, but yeah, it’s a bit messy in AUM’s matrix. You could take a look at midiLFOs. I don’t have it, so I don’t know whether it will meet your requirements, but the description in the store promises ‘Designed to be easy to configure and use...’
    https://apps.apple.com/de/app/midilfos-midi-modulator/id998273841
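    For anyone curious what such an external LFO app is doing conceptually, here is a minimal desktop sketch in Python using the mido library (with the python-rtmidi backend): it just streams a slow sine wave as CC messages. The port name, CC number and rates are arbitrary examples, not anything mLFO or midiLFOs actually uses.

        # Minimal "external LFO" sketch: generate a slow sine and stream it as
        # MIDI CC messages that any app with MIDI Learn could be mapped to.
        # Requires: pip install mido python-rtmidi
        import math
        import time

        import mido

        CC_NUMBER = 15        # whichever CC the target parameter is learned to
        RATE_HZ = 0.5         # LFO frequency
        UPDATES_PER_SEC = 30  # how often a new CC value is sent

        # Virtual output port (rtmidi backend); stop the loop with Ctrl-C.
        with mido.open_output("LFO Out", virtual=True) as port:
            t = 0.0
            while True:
                # Sine in [-1, 1], rescaled to the 0-127 CC range.
                value = int(round((math.sin(2 * math.pi * RATE_HZ * t) + 1) / 2 * 127))
                port.send(mido.Message("control_change", control=CC_NUMBER, value=value))
                time.sleep(1 / UPDATES_PER_SEC)
                t += 1 / UPDATES_PER_SEC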

  • edited December 2023

    @Polyphonix said:

    In fact, having MIDI Learn access to all available parameters only via a host prevents an immersive, real-time, full-screen live performance.
    @sinosoidal I think you must be pretty busy with BAM right now. But is there a chance that this gap will be closed in the future, so that all parameters can be mapped and performed full screen with external MIDI controllers?

    My go-to is mLFO, but yeah, it’s a bit messy in AUM’s matrix. You could take a look at midiLFOs. I don’t have it, so I don’t know whether it will meet your requirements, but the description in the store promises ‘Designed to be easy to configure and use...’
    https://apps.apple.com/de/app/midilfos-midi-modulator/id998273841

    @Polyphonix you ROCK! Thanks so much for your help!!

    And yep, just as you said, mapping the LFO matrix works fine as an AU (but I want that full screen so bad!)

    Edit… New approach: iPad 1 (main) sends CCs to AUM on iPad 2 (VS); iPad 2’s AUM routes mLFO to VS standalone… here we go!! (Less stuff in the main iPad 1 AUM matrix.) Whew… but I wish this thing would follow start and stop commands (standalone… oh well).

  • edited December 2023

    @BongoJak said:

    @Polyphonix you ROCK! Thanks so much for your help!!

    And yep, just as you said, mapping the LFO matrix works fine as an AU (but I want that full screen so bad!)

    Edit… New approach: iPad 1 (main) sends CCs to AUM on iPad 2 (VS); iPad 2’s AUM routes mLFO to VS standalone… here we go!! (Less stuff in the main iPad 1 AUM matrix.) Whew… but I wish this thing would follow start and stop commands (standalone… oh well).

    Near the MIDI Learn button of the stand-alone instance, set Clock Source to ‘Link’, and start/stop follow (of its LFO) is good to go.

  • @Polyphonix said:

    Near the MIDI Learn button of the stand-alone instance, set Clock Source to ‘Link’, and start/stop follow (of its LFO) is good to go.

    I’ve been down that hole, lol… I was talking about getting the actual stop button in standalone (next to record) to follow start and stop with Ableton Link clock (nope).

    And I tried my other route: I can only use one LFO per CC. Sending two LFOs with the same CC doesn’t do what the matrix does… also the LFO matrix allows for more control, like hue and saturation…

    I’ll think I can do something and… nope. All day with this… Well, at least I know the limitations now. Again, appreciate your help! PC presets for everything it is, I guess.
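    A way to picture the limitation above (a rough Python sketch of the concept, not any app’s actual behaviour): one CC number carries a single value stream, so two LFOs sending the same CC simply overwrite each other, whereas each matrix cell holds its own independent modulation depth.

        # Two LFOs on one CC: the receiver only ever holds the last value sent.
        lfo_a, lfo_b = 0.8, 0.2   # hypothetical LFO outputs at some instant

        cc15 = lfo_a              # message from LFO A arrives...
        cc15 = lfo_b              # ...then LFO B's message replaces it

        # The matrix instead keeps an independent depth per (modulator, attribute)
        # cell, so both modulations can act at once on different attributes.
        matrix = {("LFO", "Hue"): lfo_a, ("LFO", "Sat"): lfo_b}

        print(cc15)    # 0.2 -- only one LFO "wins" on a shared CC
        print(matrix)  # {('LFO', 'Hue'): 0.8, ('LFO', 'Sat'): 0.2}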

  • @BongoJak After looking at VS from the perspective of live performance, prompted by your question here in the forum, I have to conclude that it was probably not designed for that in the first place. The features and performance of Synesthesia or Resolume can’t be expected at this price tag; that would have been too good to be true.

    As a workaround, you could sacrifice the brightness attribute by placing all 8 layers on one fader and fading them in very quickly in manual sync. But this is admittedly a pretty miserable crutch. Btw, even in AUv3 mode you can activate full-screen mode via the pop-out button to the right of the full-screen button if you have an external display/projector connected.

    I think it would be better to address your requirements directly to the guys at Imaginando and keep us up to date in the forum on their future VS intentions. Cheers!

  • I enjoy VS and… I’m wondering about alternatives.

    My use case is for very simple music visualizations. Basically, I want to create videos from short recordings I’m sharing on social media. Apart from dedicated services, most social media sites don’t allow me to just upload sound files. Besides, I notice that I get better interactions when I share some kind of video file which relates to the music itself without representing the way I created the recording. I’m not trying to generate “engagement” in the typical social media sense. It’s just about sharing my silly experiments with acquaintances and friends.

    VS does work pretty well for this once I set things up properly. Which I only do after I’m done with my exploratory musicking. By that point, I just want the process to be as quick as possible.

    Ideally, what I’d like would be as easy to use as VS, if not more. And it wouldn’t require a subscription.
    In terms of price, I might be ok with pay-per-use. Or “desktop plugin price”. I don’t care if it’s on iPadOS, on macOS, or on the web.

    There are a couple of online services which let you upload a sound file and generate a decent video. Either their business models are off-putting or the workflow is more involved than using VS.

    And there are very elaborate options, both commercial and Free/Libre Open Source Software. Someone who’s motivated enough can do wonders with the aforementioned Resolume… or with Cables.gl, TouchDesigner, p5js, Houdini, Max/Vizzie, Isadora, etc.
    I did dabble in the F/LOSS ones; the limitations in TD’s “Community Edition” don’t bother me, and I own a Max license. It’s just that… I probably need enough motivation to get into any of these.

    Other options I’ve used include screen captures (either on iPadOS or through OBS on macOS) and overlaying a static image using a video editor. Those are mostly fine for my purposes and they’re very easy. It’s just that what happens when I post something more engaging is fun (like the time I ended up teaching a fellow pedagogue how I create music).

    Soooo… Suggestions welcome.

    (In the meantime, I’ll try to catch up on your VS suggestions. Who knows, maybe I’ll find a way to set things up so it doesn’t require me to redo everything with every project.)

  • @Enkerli said:

    Ideally, what I’d like would be as easy to use as VS, if not more. And it wouldn’t require a subscription.

    Soooo… Suggestions welcome.

    The graphics department (collective for both video and still image) is a jungle of greed and finding any good apps without subscription will be hard. Personally I use many little apps to tweak my own artwork into video. One you can’t be without, and which has no subscription, is Glitch Studio; it’s awesome. Generate is an app you can buy one time, and so is JamCam (very cheap sub on the latter, like less than a buck a month). Staella has no subscription but many IAPs. Vythm has a sub and a one-time purchase, I think. That’s what I can come up with for now; it is, as said, a jungle of mostly overpriced crap, but hopefully this helped some.

  • edited February 17

    @Pxlhg said:
    The graphics department (collective for both video and still image) is a jungle of greed and finding any good apps without subscription will be hard. Personally I use many little apps to tweak my own artwork into video.

    Thanks a lot for this insight and those suggestions!
    I didn’t perceive the jungle as I was checking some "trees". Not too surprising that it’d be like that, given the perceived opportunity to make money out of "wealthy (wannabe) creators".

    So I’ll try the tools you mentioned. Hope I have the right ones from the App Store.

    At first blush, I get the impression that Staella is close enough to what I had in mind. While it has some quirks, it works the way I expected, overall. I did buy the “Pro” pack and… honestly, this is pretty much what I needed!
    In fact, I’ll probably have fun with the “SFX pro” controls (Slip, Glitch, Noise, LightLeak, Distortion, Vignette, Mirror, Angle, Rain, Hexagon).

    I know it’s easy to produce something better than that. I’m in a "good enough" situation.

    "I’m a simple person, I don’t need much."

    Also, right after posting, I tried Headliner. It’s mostly designed as a way to promote podcasts through videos. It has a very simple waveform visualizer. Its free plan is quite reasonable (5 downloads a month, without watermark).

    Still, I think I’ll just use Staella for a while, to share my musicking experiments as MP4s.

    Now, VS can do much of the same… if I just use the mixed audio in the same way. Just have to set things up so that it reacts to audio. Not difficult to do, though most presets don't do anything with the audio signal. And I could add text in "postprod".

    So, at the risk of sounding lazy… Yeah, I actually prefer something like Staella for my basic needs.

    I'll use VS for more elaborate projects.

    Thanks again, @Pxlhg !
