Comments
To unlearn a parameter, simply activate midi learn mode and double-click the desired parameter. It should clear its midi learn state and an "N/A" label should appear.
We definitely need to create a decent mapping manager where we can see the assignments. That will take some time though.
That isn't working for me... using the AU on a 9.7-inch iPad on iOS 13.7. Double-tapping the parameter's knob or label has no impact. I am working in AUM. EDIT: it does work... but one has to be sure no MIDI comes in while trying to unlearn.
Also, even though I have never saved changes to the Factory 'default' patch I am stuck with it having the last state it was in. There seems not to be a way to return to its initial settings.
A couple more questions:
@xglax @sinosoidal : I've edited my comment above to indicate that while double-tapping will unlearn an assignment if no MIDI is coming in, if any MIDI is coming in, you can't unlearn. I'd recommend changing the mechanics here. Perhaps if a double-tap happens during midi learn you should clear the knob without making it active for assignment.
@sinosoidal wondering if you have a road map for new features that you might be able to share with us. Thanks
Yes, sure! We have been working hard on output to Syphon (Mac) and Spout (Windows), as well as render to file. This has required a new render engine architecture, a very delicate and complex task.
In theory we have render to file working already, but in practice it is not ready for production yet, as the frame rate drops heavily while recording (below 30 fps). We need to find a way of making it more performant. It will take a while longer.
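For a sense of the numbers involved, here is a rough frame-time budget calculation. The render and encode costs below are illustrative assumptions, not Imaginando's actual measurements:

```python
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available per frame at a target frame rate."""
    return 1000.0 / fps

minimum_budget = frame_budget_ms(30)  # ~33.3 ms per frame to hold 30 fps

# Hypothetical costs: a 16 ms render plus a 20 ms per-frame file encode
# already exceeds the 30 fps budget, which is how recording drags fps down.
render_ms, encode_ms = 16.0, 20.0
print(render_ms + encode_ms > minimum_budget)  # True
```

The usual fixes are moving the encode off the render thread or writing raw frames and encoding after the fact, but which approach VS will take isn't stated here.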
We have also worked on an in-app Giphy browser, where you can search, preview, and load GIFs directly from VS.
We have submitted a new beta that should be available soon for iOS. This build features the new render engine but without the render to file functionality enabled as it is not usable yet.
Wow. Amazing. Any longer term plans for opening up the materials ?
Yes, the Hue of a material can be controlled by midi, LFO, or audio.
Select the three-slider icon to the right of the layer marked B (for background).
You will be presented with one of two windows. Window 1 has a moving display of the LFO, and window 2 shows the parameters. The three yellow bars above display the controllable parameters. Click either side of a yellow bar to open more parameters. The control for Hue is on the left-hand side of the first menu.
Hope this makes sense.
Yes, sure! One problem at a time!
Hehehe ❤️😎💪😍🥂👍🤪
By the way, the 1.1.0 beta is now available on TestFlight.
I've seen that matrix window but I don't see anything that lets you map CCs to those parameters. From what I can see, you can assign KBD (which I believe is midi note number) or Velocity. Is there a way to map CC to the items in the matrix?
Since the AU exposes its params to AUM, I can use AUM's MIDI Control to control these. But I wonder if I am overlooking something.
OK, sorry, I admit defeat. I have tried to change Hue using midi but so far without success. I just figured that it should be possible, though in truth I had not tried myself. Seems daft to have parameters that can’t be changed using midi.
UPDATE: KBD is the same as MIDI note in, I believe. So use that and ensure the trigger button says Midi. Should work.
Genuine question, does anyone use giphy? They seem a bit dated or perhaps aimed at a younger audience to me. Pexels perhaps?
I think KBD is MIDI note in. If using AUM, the Hue, etc has been exposed as an AU parameter. So using AUM's MIDI Control is the way to go to use non-note MIDI for this. It has the added benefit of being able to map the range.
I do! The new beta is fantastic!
The new beta version is good in that it allows instant access to Giphy, but for me the Giphy clips do seem aimed at a young, "TikTok"-like audience. I am 62 and still create surreal art, but I would like access to longer, more arty clips. Another possibility might be to run a series of short clips one after another.
Oh for sure. You have to dig under the surface to find good stuff on Giphy.
@Toastedghost here’s a quick video that may help anyone trying to automate an au parameter in AUM.
@espiegel123 I’m pretty sure AU parameters are the only way. There are over 150 exposed parameters per layer. With only 127 CCs per channel, it’s impossible to pre-assign CCs to them.
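A quick sanity check of that arithmetic, using the thread's own figures:

```python
# Figures quoted in the thread (not official Imaginando numbers):
PARAMS_PER_LAYER = 150   # exposed parameters per layer
CCS_PER_CHANNEL = 127    # CC numbers available on one MIDI channel

# Even a single layer has more parameters than one channel has CCs,
# so a fixed CC-to-parameter map can't cover everything:
print(PARAMS_PER_LAYER > CCS_PER_CHANNEL)   # True
print(PARAMS_PER_LAYER - CCS_PER_CHANNEL)   # 23 parameters left unmapped
```

Which is why host-side mapping of AU parameters (as in AUM) scales better than baked-in CC assignments.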
@xor: What a great, to the point instructional vid.
Thanks for doing that.
MIDI setup in AUM remains a mystery to me even though I use it every time I open a session.
The new Giphy thing is freakin COOL!
What is Giphy? And yes, I’m on the beta. And yes, I’m 59, so pardon my tech not being up to date!
Superb roadmap
Currently, the only way to midi learn the Hue and Saturation parameters is, as you said, using the exposed AU parameter (in AUM, for example) and using AUM itself to map it. The other parameters can be midi-learned with VS's internal midi learn function.
Exactly, KBD is the midi note that comes in with the midi note-on event. Note 60 represents the middle point, and every note-on event takes the note value and scales the modulation proportionally. The bigger the difference between the received note and the reference of 60, the bigger the modulated output value.
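A minimal sketch of that KBD behavior, assuming a simple linear scaling around note 60 (the exact curve and range VS uses aren't specified here, so the normalization below is an assumption):

```python
def kbd_modulation(note: int, center: int = 60) -> float:
    """Map an incoming MIDI note (0-127) to a bipolar modulation value.

    Returns 0.0 at the center note, growing toward +/-1.0 as the
    received note moves away from the reference of 60.
    """
    if not 0 <= note <= 127:
        raise ValueError("MIDI notes are 0-127")
    span = max(center, 127 - center)   # largest possible distance from center
    return (note - center) / span

print(kbd_modulation(60))   # 0.0 -> middle point, no modulation
print(kbd_modulation(127))  # 1.0 -> maximum upward modulation
```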
Awesome to see Giphy support. It would be cool if multiple layers of Giphy were supported, so it can be used in place of a material. Also, more options and mod matrix support for messing with backgrounds and Giphy gifs would be great.
Yes, the Giphy integration is just soooooo cool and easy and useful. But it immediately leads to “if only I could layer two gifs or three, and then switch to another bank of gifs for the back half of the song”. I trust Imaginando to get there in time, and it’ll be an everyday tool at that point.
You can partly achieve this by duplicating the preset, changing the background layer, then changing presets via CC.
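If you want to script that preset-change step, a raw MIDI Control Change message is just three bytes. The channel, CC number, and value below are placeholders for whatever you've midi-learned in VS, not documented defaults:

```python
def control_change(channel: int, cc: int, value: int) -> bytes:
    """Build a raw MIDI Control Change message: status 0xB0 | channel,
    then the CC number and value (all data bytes 0-127)."""
    assert 0 <= channel <= 15 and 0 <= cc <= 127 and 0 <= value <= 127
    return bytes([0xB0 | channel, cc, value])

msg = control_change(channel=0, cc=0, value=5)  # placeholder CC/value
print(msg.hex())  # b00005
```

You'd send these bytes from whatever sequencer or MIDI utility you have upstream of VS; the preset-load pause mentioned below is a separate question.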
True, but it's an extra step, and are there pauses while the next preset loads? I should do an experiment.