Audiobus: Use your music apps together.
What is Audiobus? — Audiobus is an award-winning music app for iPhone and iPad which lets you use your other music apps together. Chain effects on your favourite synth, run the output of apps or Audio Units into an app like GarageBand or Loopy, or select a different audio interface output for each app. Route MIDI between apps — drive a synth from a MIDI sequencer, or add an arpeggiator to your MIDI keyboard — or sync with your external MIDI gear. And control your entire setup from a MIDI controller.
Audiobus is the app that makes the rest of your setup better.
Comments
Or you can go completely outside its intended use.
Setup in AUM from yesterday:
Turnado on master channel.
Switching scales and changing root key in Particles + Turnado...
3 hours of instant fun
I’d love to see a video of that setup (and what it sounds like) @recccp
Hey folks, first update with general quality of life improvements is out (1.1)!
Thanks, everyone, for all the feedback on the release version. And I hope you're enjoying MIDI Tools!
New video showing how to route MIDI around inside Cubasis.
tl;dw: Layer up tracks without copying MIDI in the arranger, or do other complex in-app routing otherwise impossible in Cubasis, like you would in AUM or apeMatrix
Nice update. Thanks for the quick work.
I did laugh at @MonkeyDrummer's "space station toilet" reference, but I think it's also a little unfair to StreamByter, which is really quite handy as well.
@blueveek Nice work on these plugins! I am sure to get a lot of use out of them.
A couple of feature requests: on the Key Zone plugin, it would be handy to be able to set the zones by dragging handles on the on-screen keyboard instead of having to repeatedly press the +/- buttons. Additionally, it would be great if there were a way of setting the zones by playing the notes on a MIDI controller (i.e., MIDI learn). The Midiflow Splitter for Audiobus works this way.
Your MIDI Curve plugin is close to perfect; the only thing I'd like to see is some form of visual feedback of the MIDI input and output levels on the XY graph. This would make setting the curve much more intuitive.
Finally, I'd love to see a MIDI random/probability gate plugin like the Midiflow Randomizer and midiFILTr-PG by Art Kerns, both of which are great but sadly not AUv3.
I was just thinking the same.
You start out with a perfect analogy.
Then extend it to infer a specific payload:
True, of course, but harsh... very harsh.
... FLUSH Another Track uploads.
I'm not really ripping on it. It's just the nature of the beast. If it had a UI that actually made it more usable, I'd love it. Programming it via punch cards and assembly language... not so fun...
I tend to look at it more like that MobMuPlat app for running PD stuff that I find online.
You can run other people’s scripts in StreamByter in the same sense.
I was all gung-ho to learn to script it myself, but of course I got sidetracked by all the ‘easier’ apps. Hopefully one day I will get back to it.
Hey, how did you know the exact release notes of the next update?
This one's a spicy proposition.
Here's a quick video showcasing velocity feedback in the Curve plugin, coming soon:
@blueveek
Probably an issue with the Monitor in Cubasis. Look at "Time": all played notes show Time: 0 (in NanoStudio it works correctly; I see note times).
It is allowed to send out MIDI events with time=0. This is interpreted by Core MIDI as "play now". But the preferred method is to put a proper timestamp on MIDI events and send them out ahead of time, to avoid jitter and similar timing artifacts. Does this screen capture show that the Cubasis sequencer is actually not doing that?
Yes. I didn't know that "time 0" is allowed (or even possible) to send, so obviously it's a bug on the Cubasis side. Thanks for the clarification.
It is indeed allowed for Core MIDI, but I'm not 100% certain it's also allowed for AUv3 MIDI communication. If I remember correctly, AUv3 MIDI messages (both going in and out) always need to have absolute timestamps. You can verify that if you look at how Apple's AU sample code interprets incoming MIDI messages.
I'm not doing anything differently based on hosts; it's all the same logic everywhere. The best I can do is push out an update with more timing info per note (event time vs. render block note time, etc.) to highlight subtle differences in how hosts treat absolute vs. relative timing.
FWIW, BeatMaker 3 also shows the correct times.
@blueveek this is on the Cubasis side, obviously... I had a discussion with Matt; he told me that it looks to him like this "Time" number you display is actually the sample offset within the current audio buffer, which means Cubasis is probably sending all MIDI events that occurred during an audio buffer together at the beginning of the next buffer.
Which suggests to me that Cubasis doesn't have the best MIDI timing. I haven't actually tried it, but it looks that way. Given the 48 PPQN used by its sequencer, this isn't so surprising to me.
For AUv3 MIDI we just use:
osstatus = _outputEventBlockCapture(AUEventSampleTimeImmediate, 0, 3, threeBytes);
This shows 0 in the Monitor too; same thing with the AUM keyboard, say.
I did extensive tests with Jonatan on this (back when things were undocumented and we had to establish how Apple intended this to be used). The AU framework converts timestamps internally, so what the receiver gets to see is not necessarily the same thing as what the sender originally sent.
When I double-checked with Apple later, here's what they replied:
Excellent... I haven't gotten into these tools much yet, but it feels great to know they are mine... 🙏
Thanks...
Edit... and will be mine... 😎
Correct, that's exactly what that number represents.
Btw, the Monitor helped me a lot today to identify the reason behind one very strange issue, so thanks for it. Very handy tool for debugging MIDI; for me it's the most valuable plugin in the whole package!!
Awesome to hear that, thanks!
I think that's a great idea, as things are now it's hard enough sometimes to know what's going on so additional info is always welcome!
Yeah, ideally the outcome of this would mean that both DAW and AU developers have easier insight into what's actually going on and how DAWs behave differently with regards to MIDI timing, considering that this has been a huge source of development pain for both parties. This can be debugged without the Monitor, but it's more arcane.
Exactly, that's what I was thinking.
BTW, compliments on your great work. MIDI does deserve some more love nowadays.
@blueveek Another voice of thanks for developing these tools. Over the last week I've used several without thinking about it. I would have found other ways of accomplishing what I was trying to do, but your AU tools made the task super fast and super easy. Well done!
Very interesting tools!
I see some similarity to some of the MidiFlow tools, and maybe the Audeonic Apps!
Got them anyway, as supporting the development seems in order.