Comments
Realtime compiling/interpretation is interesting!
Will there be Poly/Mono, Legato, Glide?
Sharing scripts - can users do that?
It's just MIDI. If you can express it through MIDI you can do it. Mozaic itself doesn't make sound; it's a pure MIDI AUv3 plugin.
Sure! It's all plain text, so there are many ways for getting scripts in and out of Mozaic.
Very good.
I'm going to spend thirty minutes on it when the time comes and report back.
@brambos
Does your warning on the AudioVeek thread apply to Mozaic?
Because hardware has only one GUI, different people will buy the GUI which appeals to them for their use case. There are certainly MIDI hardware controllers that are more popular than others. I can imagine people wanting to have a GUI that matches up to their hardware so they can MIDI map their hardware to Mozaic, or perhaps even buying specific hardware to use with Mozaic.
Obviously you will decide how much effort it’s worth to you and it would seem wise to minimize your investment in a GUI or the project won’t be viable for you.
I think there are a wide variety of users and you’ve already established a fan base. If you build the app, people will buy it.
Shall we bring 'Game Changer' out of retirement?
No, that risk doesn’t exist here. I’ve considered the concept of ‘Metavariables’ which live in the scope of all plugin instances, but I decided against it for several reasons (mostly conceptual rather than technical, though).
Good stuff! Looking forward to it.
Looks like another missing element in the iOS modular toolkit can now be ticked off.
As a Max man, it reminds me a little of Max for Live, but specifically for MIDI.
I'll definitely be sampling the latest of your wares when it's available.
So ... @brambos could have kept this a secret, and just used it to churn out a constant stream of single-purpose apps with minimal effort ... but he provides it to us instead.
That’s class, man!
I like the idea of limiting this to a single GUI. I mean, it all comes down to knobs and sliders really anyway. Where they are or what they look like doesn’t matter that much to me at all.
But, I was thinking, it seems like it would be possible to use this in conjunction with any control surface designer, such as MIDI Designer Pro, that supports MIDI in as well as out. One could place a custom control surface before, after, or alongside, as long as MIDI can be routed to/from Mozaic.
That kind of routing might be something to think about in the design of the app, as I can almost guarantee you some boffins will want to do something like that.
Indeed, or inserting one of these scripts on the output of e.g. KB-1, using the other plugin as the front-end, should also be trivial.
@InfoCheck
Wait... what? Really? Now I’ve got some reading to do!
Not buying it unless it’s IAA though. 😂
Looking forward to seeing how easy it is to code. I couldn’t really get my head around StreamByter, and identifying what you’re trying to achieve in the first place is the hard part.
It might be nice to have a way to display values for the controls which can be toggled on/off for when you’re coding versus playing and want a cleaner GUI?
@brambos
Did you see this?
https://discchord.com/appnews/2019/03/22/holonist-by-holonic-systems
It’s a modular, controllable world.
Now everyone can bleep 🙂🙃
About a year ago, I suggested to Nic (the MidiFire and StreamByter developer) that he add UI controls for manipulating variables in MIDI scripts live.
I'm somewhat surprised, but I definitely welcome Bram now having his own go at it, with a more "human-friendly" coding language and a MIDI clock engine.
@brambos
Very, very interesting! The sample pseudo code does seem to be pretty straightforward - looking forward to more examples to see if I could actually work with this. The UI and AU parameter functions are very appealing.
Is that the final UI? Will there be buttons? I'm hoping at least 12 buttons (with latching) so they can be used for note selection.
And with built-in LFOs, the $64K question is: how slow do they go?! That picture shows "Super Slow LFO" and the LFO speed is, I'm presuming, 0-127 * 6000 (or 0-1 * 6000). So what does that represent: 0-6000 seconds?! Hoping @MonkeyDrummer and I will have reason to celebrate!
What?! No hex editing! You evil bastard! If you make it too easy then EVERYONE will be doing it!
That’s getting close, but you still don’t need stop motion photography to see it move.
I’ll make it a point to allow you to create a 24 hour LFO.
I’m not including a ‘MIDI clock engine’ as such. But you can respond to clock messages (if the host passes them on to plugins in the first place, which not many do as far as I can tell). I’m not even sure how useful MIDI clock is in an AUv3 chain; that’s a task for the host in normal cases. But it’s just MIDI, so if you can receive it you can handle it.
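For readers unfamiliar with how clock messages work: MIDI clock is just a stream of `0xF8` real-time bytes at 24 pulses per quarter note, so a script that receives them can derive the tempo itself. A minimal sketch (plain Python, not Mozaic syntax; the class and method names are illustrative):

```python
import time

MIDI_CLOCK = 0xF8   # real-time clock tick: sent 24 times per quarter note
PPQN = 24

class ClockFollower:
    """Derive the sender's tempo from incoming MIDI clock ticks."""
    def __init__(self):
        self.last_tick = None
        self.bpm = None

    def on_byte(self, status, now=None):
        if status != MIDI_CLOCK:
            return
        now = time.monotonic() if now is None else now
        if self.last_tick is not None:
            tick_interval = now - self.last_tick          # seconds per tick
            # 24 ticks per quarter note -> seconds per beat -> BPM
            self.bpm = 60.0 / (tick_interval * PPQN)
        self.last_tick = now

follower = ClockFollower()
# Simulate ticks arriving at 120 BPM: one tick every 60/(120*24) seconds.
for i in range(3):
    follower.on_byte(MIDI_CLOCK, now=i * 60.0 / (120 * 24))
print(round(follower.bpm))   # 120
```

This is also why MIDI clock in an AUv3 chain is awkward: the host already knows the tempo, so plugins usually read it from the host rather than reconstructing it from ticks.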
Could you be convinced to reconsider? Global variables are quite a useful feature in StreamByter that I take advantage of in scripts that identify a chord played in one stream and let it affect the MIDI played out in another. StreamByter is a pain to work with, though (even though I have a lot of programming experience), and the lack of a GUI is limiting. Another example would be AudioVeek's MIDI tools, which also take great advantage of such a capability.
OK, you're right, it's more like some kind of clock generator for use in a script that runs in sync with the host transport and bpm.
I'm just wondering what could be the most accessible syntax for building arps, note doublers/shufflers/repeaters, and simple, controllable generative sequencers. Somehow you have to refer to a timing reference in order to send MIDI messages at the correct time.
I have several methods for that:
The programmable Metronome generates a pulse at a sync rate of 1–384 PPQN (still debating the max rate). The pulse generates an event that you can assign a subscript to.
In a similar way, you can set up a sample-accurate timer which can run independent of the host's tempo (you define the timer interval in milliseconds and it will stay constant even if the host's tempo changes).
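The two timing hooks described above can be sketched as an event dispatcher: a host-synced metronome pulse configured in PPQN, and a wall-clock timer whose interval ignores the host tempo entirely. This is plain Python for illustration; none of the names (`TimingEngine`, `subscribe`, `tick`) are Mozaic's actual API.

```python
class TimingEngine:
    """Toy model of a host-synced metro pulse plus a tempo-independent timer."""
    def __init__(self, host_bpm):
        self.host_bpm = host_bpm
        self.metro_ppqn = 4          # pulses per quarter note (here: 16th notes)
        self.timer_ms = 250.0        # fixed wall-clock interval in milliseconds
        self.handlers = {"metro": [], "timer": []}

    def subscribe(self, event, fn):
        # Analogous to assigning a subscript to an event.
        self.handlers[event].append(fn)

    def seconds_per_pulse(self):
        # One quarter note lasts 60/bpm seconds; divide by the PPQN rate.
        return 60.0 / self.host_bpm / self.metro_ppqn

    def tick(self, elapsed_s):
        # Fire the metro once per pulse boundary crossed; fire the timer per
        # millisecond interval -- the timer does not depend on host_bpm at all.
        pulses = int(elapsed_s / self.seconds_per_pulse())
        timers = int(elapsed_s / (self.timer_ms / 1000.0))
        for _ in range(pulses):
            for fn in self.handlers["metro"]:
                fn()
        for _ in range(timers):
            for fn in self.handlers["timer"]:
                fn()

engine = TimingEngine(host_bpm=120)
counts = {"metro": 0, "timer": 0}
engine.subscribe("metro", lambda: counts.__setitem__("metro", counts["metro"] + 1))
engine.subscribe("timer", lambda: counts.__setitem__("timer", counts["timer"] + 1))
engine.tick(elapsed_s=1.0)
print(counts)   # one second at 120 BPM: 8 metro pulses (2 beats x 4), 4 timer fires
```

If the host tempo changed, the metro pulse rate would change with it while the 250 ms timer kept firing at the same real-time rate, which is exactly the distinction drawn above.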
YES, that's what I mean!
Cool!
@brambos will the pads support retrieving the (x, y) touch locations? I was thinking for aftertouch and CC modulations, but maybe @aplourde could also use this in a script to turn the 4 pads into 16 buttons or more, just by thresholding on the (x, y) values. In that case, it would be helpful if the app allowed for dynamic labeling of the pads using some horizontal and vertical thin shaded lines to indicate boundaries.
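The "4 pads as 16 buttons" idea amounts to thresholding the touch coordinates into a grid. A sketch, assuming (hypothetically) that each pad reports its touch position as normalized x and y in 0.0–1.0:

```python
def pad_to_quadrant(pad_index, x, y, grid=2):
    """Map a touch on pad `pad_index` to a virtual button index.

    With grid=2, each pad splits into 2x2 quadrants, so four pads
    yield 16 distinct buttons (indices 0-15).
    """
    col = min(int(x * grid), grid - 1)   # clamp x == 1.0 into the last column
    row = min(int(y * grid), grid - 1)
    return pad_index * grid * grid + row * grid + col

# Touching the four corners of pad 0 yields virtual buttons 0-3:
print([pad_to_quadrant(0, x, y) for (x, y) in
       [(0.1, 0.1), (0.9, 0.1), (0.1, 0.9), (0.9, 0.9)]])   # [0, 1, 2, 3]
```

Raising `grid` to 3 would turn the same four pads into 36 buttons, which is why dynamic labeling or boundary lines on the pads would help.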
Hoping the background color can be adjusted to easily distinguish between instances (as in recent @blueveek piano roll examples). It could even be cool to have this available for dynamic labeling as well: the color changing by a script command to reflect some state. Let's say you want to indicate by color which MIDI output channel the script currently uses, for instance. Could be useful for live jamming.
Wow, this looks awesome.
Will it be possible to send SysEx? I know it's a long shot, but I'd love to be able to send messages to hardware.
@brambos interesting project!
Hopefully it includes buffers/arrays you can reference?
E.g. a simple MIDI transpose script needs to cater for changes in the transpose amount (an AU parameter) between note-on and note-off.
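That bookkeeping problem is worth spelling out: if the transpose amount changes while a note is held, the note-off must still go out on the pitch the note-on actually used, or the note hangs. A small map keyed by the incoming note handles it. Plain Python sketch, not Mozaic syntax:

```python
NOTE_ON, NOTE_OFF = 0x90, 0x80

class Transposer:
    """Transpose note messages while keeping note-offs matched to note-ons."""
    def __init__(self, semitones=0):
        self.semitones = semitones
        self.active = {}   # (channel, incoming note) -> note actually sounding

    def process(self, status, note, velocity):
        channel = status & 0x0F
        kind = status & 0xF0
        if kind == NOTE_ON and velocity > 0:
            out = max(0, min(127, note + self.semitones))
            self.active[(channel, note)] = out   # remember what we really sent
            return (status, out, velocity)
        # Note-off (or note-on with velocity 0): look up the sounding pitch,
        # ignoring whatever self.semitones is *now*.
        out = self.active.pop((channel, note), note + self.semitones)
        return (status, out, velocity)

t = Transposer(semitones=12)
on = t.process(0x90, 60, 100)    # C4 in, C5 out
t.semitones = 7                  # the AU parameter changes mid-note...
off = t.process(0x80, 60, 0)     # ...but the off still matches the sounding C5
print(on, off)                   # (144, 72, 100) (128, 72, 0)
```

Without the `active` map, the note-off would be transposed by the new value (+7) and the original C5 would never receive its off message.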