Comments
@sinosoidal
I’m not sure if GitHub is the place to report bugs, but I posted there.
I’ll report it here as well:
I can reproduce a crash in the macOS version of VS when using Sidecar on the Mac and dragging the application to the iPad screen.
As soon as I click on it with an Apple Pencil, it crashes.
Let me know if you need more information.
Is there any way to import an audio file into the standalone app or do you have to use a host?
I don’t think you’re getting what I’m suggesting. I’m not talking about generating MIDI from the visuals; I’m talking about the LFOs in VS sending that MIDI out. That way we could have LFOs driving graphics the way we want via in-app LFOs… then send those same LFOs out to a pad or other instrument outside of VS.
NOT the visuals generating MIDI. Is that clearer?
Actually, it might open up interesting possibilities for creating audiovisual work, for example taking the MIDI output and feeding it into a generative patch in MiRack (that’s already providing the audio for VS) so it can modulate something (or multiple things).
However, I can understand that a) you have a huge to-do list already, and b) the above might not be useful to that many people.
Yes, sorry! Something to consider in the future.
Curious... is MIDI out something as simple as just turning it on? Or would it take a good deal of development and testing?
I ask because I don't actually know how useful it would be. If it's just a matter of turning it on... then I'd love to experiment with it. If it's a great deal of extra effort, then yeah... maybe in the future.
There's another app in the App Store called VOSC, and it's a visual synth. It doesn't make sound, but it uses oscillators to manipulate particles. It looks to be abandoned, but still looks interesting.
I asked about this because the Shockwave synth is basically a self-contained modular. It's got lots of oscillators, LFOs, and sequencers built in for you to work with... and it has MIDI in for playing Shockwave as a synth. But it ALSO has MIDI out. So you can have your oscillators and LFOs creating sounds within the app, then ALSO send those same sequences, LFOs, etc. as MIDI out to another synth, to be affected in sync with what you've got going on inside Shockwave. It's REALLY cool to experiment like this, and I'm betting you could do similar stuff using VS's MIDI out from its LFOs.
Just a thought. Love the app. Thanks for at least considering it.
Cheers
I had a look at the review section for VOSC, and a reviewer stated that it doesn't have MIDI in, which is a pity as it looks very interesting.
@sinosoidal
A couple of things if you don't mind.
Is it possible for VS to remember its sync settings when closed and reopened in standalone mode? It defaults to internal sync. If this is deliberate then please ignore this.

Can we reset the individual LFO phase using a controller message? I'm trying to get the LFO to start on the one when synced. It feels like a global key press rather than being able to assign a specific key press or gate signal.

I'm going to be creating a template in dRambo for VS. Obviously I could use the LFOs in dRambo, but I'm learning the capabilities of VS first.
Send the internal LFOs and EGs to the midi outs.
@sinosoidal hey, is it possible to start modulation at the beginning of a musical phrase and then fade it out at the end of that phrase? I thought that an EG with a slow attack and slow release would work, but each key press and release re-triggers the EG, so only a long sustained note hits the maximum. Similarly, as a trigger, is there a way to keep the trigger high as long as notes keep coming in reasonably (user-defined) quickly?
Also, can you look at hue modulation on the solid color material? Hooked to an LFO, it seems that all positive values are red and then all negative values cycle through all the colors, including red.
[AUM with keyboard routed to VS in an effects slot] Also, the solid color brightness EG modulation seems, well, odd. With trigger set to none, set brightness full on. Everything is good. Now set EG modulation of brightness to -1. Press a key and the screen fades to black. Almost expected; there’s no trigger, so why is it responding to MIDI? Now change trigger to MIDI and the screen goes immediately black. Why? Brightness is full. Now press a key: with EG modulation at -1 or +1, nothing happens. Set brightness off, EG to -1 or +1, same. Guessing that the trigger is hooked up backwards on solid color?
Also, LFO modulation of brightness doesn’t appear to change when the LFO is positive.
Thanks
Yes please.
LFOs and EGs are generated at audio rate. This means there will be 44100 values per second, assuming the sample rate is 44100. It is not feasible to simply redirect that amount of values to MIDI out, so the signal would need to be sampled down, let's say at 30 or 60 fps (the selected frame rate).
We would also need to add parameters to manage the output. We don't want all the LFOs spitting their output to MIDI out at the same time. There would need to be some kind of setting to say which LFO goes to which channel, and so on. This means we'd need additional UI and parameters...
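Just to illustrate the idea (the struct, function names and rates below are assumptions for the sketch, not actual VS code), the downsampling would look roughly like this:

```cpp
// Purely illustrative sketch: downsampling an audio-rate LFO to MIDI CC at a
// fixed frame rate. Names, rates and the output hook are assumptions, not VS code.
#include <cmath>
#include <iostream>

constexpr double kTwoPi = 6.283185307179586;

struct Lfo {
    double phase = 0.0;        // 0..1
    double frequencyHz = 1.0;

    // Audio-rate tick: returns one value in [-1, 1] per sample.
    double process(double sampleRate) {
        phase = std::fmod(phase + frequencyHz / sampleRate, 1.0);
        return std::sin(kTwoPi * phase);
    }
};

// Stand-in for a real MIDI output (CoreMIDI, RtMidi, ...).
void sendControlChange(int channel, int cc, int value) {
    std::cout << "ch " << channel << " CC" << cc << " = " << value << "\n";
}

int main() {
    const double sampleRate = 44100.0;
    const double fps = 60.0;                                    // chosen output rate
    const int samplesPerFrame = static_cast<int>(sampleRate / fps); // ~735 samples

    Lfo lfo;
    int counter = 0;

    // One second of audio in: 44100 LFO values, but only ~60 CC messages out.
    for (int i = 0; i < static_cast<int>(sampleRate); ++i) {
        const double v = lfo.process(sampleRate);
        if (++counter >= samplesPerFrame) {
            counter = 0;
            // Map [-1, 1] to the 0-127 MIDI range and emit one CC per "frame".
            sendControlChange(0, 1, static_cast<int>((v * 0.5 + 0.5) * 127.0));
        }
    }
}
```

At 60 fps that's roughly one CC message every 735 samples instead of 44100 values per second.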
Sorry but we have many important things to tackle before even considering this feature.
I think it is possible. Let's take a look.
If we had a reset button on the LFO UI, that button could be assigned to a midi control.
What do you mean with "start on the one when synced"?
I also didn't get this one.
If you want more control you should use something that allows you to automate in time.
For instance, LK has automation abilities. You can set an automation lane for a certain CC and then use that CC to modulate a parameter in VS.
Let's say you are using EG to modulate brightness. You would take this modulation out and modulate brightness directly with an automation lane.
Makes sense?
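To make that concrete, here's a rough sketch of the values such an automation lane would effectively send over a phrase; the CC number, step count and fade lengths are arbitrary assumptions:

```cpp
// Illustrative sketch of what an automation lane effectively sends: a CC ramp
// that rises at the start of a phrase and falls at the end, which VS can map
// to brightness (or any MIDI-learnable parameter). The CC number, step count
// and fade lengths are arbitrary assumptions.
#include <algorithm>
#include <iostream>

int main() {
    const int steps = 16;          // e.g. one value per 16th note of the phrase
    const int attackSteps = 4;     // fade in over the first quarter
    const int releaseSteps = 4;    // fade out over the last quarter
    const int cc = 74;             // whichever CC the VS parameter is learned to

    for (int i = 0; i < steps; ++i) {
        double level = 1.0;
        if (i < attackSteps)
            level = static_cast<double>(i) / attackSteps;
        else if (i >= steps - releaseSteps)
            level = static_cast<double>(steps - 1 - i) / releaseSteps;

        const int value = std::clamp(static_cast<int>(level * 127.0), 0, 127);
        std::cout << "step " << i << ": CC" << cc << " = " << value << "\n";
    }
}
```

Map that CC to brightness in VS and the fade follows the phrase instead of individual key presses.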
Can you make a simple video that replicates the issue and share it with us?
I did a quick search for MIDI out in visualization apps. I think it is scarce even on the desktop.
Magic Music Visuals and Synthesthesia don't.
VDMX does have full bidirectional MIDI mapping:
https://docs.vidvox.net/vdmx_data_io.html
The exact search string must have been "sitting on toilet surreal gif" by the looks of it 🤓
Can someone explain simply why MIDI out from VS is desirable? Is it to drive another synth?
Scroll back a few pages. More than one person has mentioned possible applications. No matter, it’s not likely to be implemented (which is fine by me).
Any chance of being able to load video from the Photos app?
I think some hosts might not support the music effect AU type, so it's probably useful to have both AUs.
When synced to tempo, the LFOs freewheel with regard to phase? When sent a reset signal or key press, they begin cycling again? So I wanted to find out if we could assign a key press to start the phase of an individual LFO at the beginning of a loop, without affecting the others and without it being reset every time.

The LFOs react to all incoming key presses rather than a specific key press or gate signal? So when they are set to 'phase' they react to every key press? Which is why I said 'global'.

In this screenshot you can see where I triggered the LFO, and its phase gets reset with every key press.
We have seen this problem before during our tests. One solution is to have a filter by channel for the LFO. Instead of reacting to note-ons from all channels, it would react only to the ones coming in on a certain channel.
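Purely as a sketch of that idea (the types and names below are assumptions, not VS internals), the filter could look something like this:

```cpp
// Illustrative sketch of a per-LFO MIDI channel filter for phase reset.
// Types and names are assumptions, not VS internals.
#include <array>
#include <iostream>
#include <optional>

struct LfoState {
    double phase = 0.5;                  // wherever the LFO happens to be
    // If set, only note-ons on this channel reset the phase;
    // if empty, this LFO ignores note-ons entirely.
    std::optional<int> resetChannel;
};

// Called for every incoming note-on.
void handleNoteOn(std::array<LfoState, 4>& lfos, int channel) {
    for (auto& lfo : lfos) {
        if (lfo.resetChannel && *lfo.resetChannel == channel)
            lfo.phase = 0.0;             // restart the cycle "on the one"
    }
}

int main() {
    std::array<LfoState, 4> lfos;
    lfos[0].resetChannel = 2;            // LFO 1 resets on channel 2 only
    // LFOs 2-4 keep resetChannel empty and free-run.

    handleNoteOn(lfos, 2);               // resets only LFO 1
    handleNoteOn(lfos, 1);               // a note on channel 1: no resets
    std::cout << "LFO 1 phase: " << lfos[0].phase << "\n";  // prints 0
}
```

Each LFO would then only listen to its own channel for phase resets, while notes on the other channels still play VS normally.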
Ahhhh....
So is it possible for us to assign four separate MIDI channels, one for each LFO? That way we can treat the phase start point of each LFO individually and still be able to play VS using a keyboard or MIDI controller?
Okie dokie
Silly question, but when you run in standalone mode how do you get audio in?
If you're on iOS, it will receive the input of your device. If you're on Mac/Windows, it will receive the audio input of your selected audio device.
If you wish to use internal audio (on Mac/PC), you need some kind of loopback software so you can route your internal audio to VS.
A long press on B, the background layer, reveals Files, Resources and Photos folders. Inside the Photos folder, find the video you want.
No videos showing up there, only in Files. There is no Photos folder in the Files app, if that is what you meant.
+1
I found the same thing yesterday.
I copied the video over to 'Files' so that VS could load it.
Same here, just hoping to avoid that as I have different places for different types of video.