MIDI-CI and iOS?
There’s been some discussion about MPE support on our beloved platform. In some ways, we’re quite lucky. From Minimoog Model D and ThumbJam to GarageBand and iFretless Brass, we have quite a few MPE-savvy synths on iPad and iPhone.
But how about MIDI-CI? In a thread about physical modelling, a couple of us mentioned the new spec, which “paves the way for the future of MIDI”. CDM’s Peter Kirn also thinks MIDI-CI is the most important part of what the MIDI Manufacturers Association announced two months ago.
Apart from demos involving synthesized drawbar organs, I’ve been a bit fuzzy on the details. Maybe some of you can explain what it’s about.
So it sounds likely that MIDI-CI will become a big deal. But will it come to iOS?
@Enkerli you might want to create a new thread on the MIDI-CI spec. In some ways it seems to extend MIDI into more of a layered network protocol, along the lines of OSC, or of the document types used for designing web pages, like HTML. With increasingly powerful small computing devices that can network with each other, the time does seem ripe for a MIDI spec that takes advantage of this.

As we’ve seen with MPE, I think Apple will be on board with this, since it plays to one of their strengths: hardware integration and partnering with other companies to create a seamless environment for the user (this is their goal, at least), with iOS and macOS devices playing a central role alongside the app developers who build on top of this musical infrastructure.
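To make that concrete: as far as I can tell from the published spec, MIDI-CI rides on ordinary Universal SysEx, starting with a Discovery handshake in which each device announces a random 28-bit MUID and which capability categories (protocol negotiation, profiles, property exchange) it supports. Here’s a rough sketch in Swift of what that Discovery message looks like, going by my reading of MIDI-CI 1.0; all the identity fields and sizes below are placeholder values, not anything official:

```swift
import Foundation

/// Split a 28-bit value into four 7-bit bytes, LSB first (SysEx-safe).
func sevenBitBytes(_ value: UInt32) -> [UInt8] {
    (0..<4).map { UInt8((value >> (7 * $0)) & 0x7F) }
}

/// Rough sketch of a MIDI-CI Discovery message, per my reading of the
/// 1.0 spec. All identity fields are placeholders.
func discoveryMessage(sourceMUID: UInt32) -> [UInt8] {
    var msg: [UInt8] = [
        0xF0,  // SysEx start
        0x7E,  // Universal Non-Real Time
        0x7F,  // Device ID: to/from whole MIDI port
        0x0D,  // Sub-ID#1: MIDI-CI
        0x70,  // Sub-ID#2: Discovery
        0x01,  // MIDI-CI message version
    ]
    msg += sevenBitBytes(sourceMUID)     // Source MUID (random 28-bit ID)
    msg += sevenBitBytes(0x0FFF_FFFF)    // Destination MUID: broadcast
    msg += [0x7D, 0x00, 0x00]            // Manufacturer SysEx ID (0x7D = non-commercial/test)
    msg += [0x00, 0x00]                  // Device family (placeholder)
    msg += [0x00, 0x00]                  // Device family model (placeholder)
    msg += [0x00, 0x00, 0x00, 0x00]      // Software revision (placeholder)
    msg += [0x0E]                        // Capabilities: protocol negotiation,
                                         // profiles, property exchange (0x02|0x04|0x08)
    msg += sevenBitBytes(512)            // Max SysEx size we can receive
    msg += [0xF7]                        // SysEx end
    return msg
}

let bytes = discoveryMessage(sourceMUID: .random(in: 0 ..< 0x0FFF_FFFF))
print(bytes.map { String(format: "%02X", $0) }.joined(separator: " "))
```

A receiver that speaks MIDI-CI would answer with a Reply to Discovery (Sub-ID#2 0x71), and only then do the two ends start negotiating, which is how the whole thing stays backwards-compatible: an old device simply ignores the SysEx.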
Very interesting point. How will it work? Can devs implement it already? For instance, could @brambos use it in Rozeta? Does it replace, in any way, the use of plugin parameters?
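On the “can devs implement it already” question: since these are plain SysEx bytes, I’d guess an app could push them through Core MIDI today without waiting for any official support. A minimal sketch, assuming the Discovery bytes from the snippet above and at least one connected MIDI destination (error handling omitted):

```swift
import CoreMIDI

/// Minimal sketch: push raw SysEx bytes (e.g. the Discovery message above)
/// to the first available MIDI destination. Error handling omitted.
func send(_ bytes: [UInt8]) {
    var client = MIDIClientRef()
    MIDIClientCreate("MIDI-CI Test" as CFString, nil, nil, &client)

    var outPort = MIDIPortRef()
    MIDIOutputPortCreate(client, "Out" as CFString, &outPort)

    let destination = MIDIGetDestination(0)

    // A MIDIPacketList holds 256 bytes inline, enough for a Discovery message.
    var packetList = MIDIPacketList()
    let packet = MIDIPacketListInit(&packetList)
    MIDIPacketListAdd(&packetList, MemoryLayout<MIDIPacketList>.size,
                      packet, 0, bytes.count, bytes)
    MIDISend(outPort, destination, &packetList)
}
```

Whether anything on the other end answers is another matter, of course.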