Audiobus: Use your music apps together.
Comments
Will Particles be an IAP? Not that it's a problem, 'cause it looks just like the kind of generative thing I was mentioning @brambos
Can't wait!
Free update
https://media.giphy.com/media/1zC0cBBz9UH4I/giphy.gif
This particle addition is an awesome creative plus to this great app.
Thanks for thinking outside the box @brambos
keep em coming!
‘Rosie the Riveter’ nailed it...
I don’t want any free update.. I would like to see the XoX UI be more like GarageBand’s: separate controls for the ‘mute’ function, etc. If we can also get a melodic (step-seq) version of it, that would be cool too.
Edit... separate probability, note repeat, and velocity controls, etc..
Blimey, thank you.
Right - zero is a value! Think of it this way - we call this the 21st century.
You know that, I know that, but whoever wrote the app upgrade notes missed it. I’m just correcting the record. (And shaking my head over how often very bright programmers trip up on that issue out of carelessness. I’ve seen it over and over).
That's too funny. I remember a lot of confusion with 0 and 1 with regards to start frame numbers and frame counts when I worked in 3d animation. The compositors and animators would be scratching their heads trying to figure out which frame number they were referring to or how long a sequence was supposed to be, all caused by some people calling the first frame "frame 1" while others called it "frame 0."
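The zero-vs-one confusion described above is the classic off-by-one pitfall. A hypothetical sketch (not from any app mentioned here) shows why mixing the two conventions breaks both frame references and sequence lengths:

```python
# Illustration of the frame-numbering pitfall: a sequence that includes
# both its first and last frame has (last - first + 1) frames, whatever
# convention you use for labelling the first one.

def frame_count_inclusive(first: int, last: int) -> int:
    """Number of frames when both endpoints belong to the sequence."""
    return last - first + 1

# Zero-based convention: frames 0..99 is a 100-frame sequence.
assert frame_count_inclusive(0, 99) == 100

# One-based convention: frames 1..100 is also a 100-frame sequence.
assert frame_count_inclusive(1, 100) == 100

# The trap: naively subtracting endpoints drops a frame...
assert 99 - 0 == 99  # off by one versus the true count of 100

# ...and mixing conventions shifts every reference: "frame 50" names
# two different pictures depending on whether counting starts at 0 or 1.
```

Either convention works on its own; the head-scratching starts only when two people on the same project use different ones.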
I'm happy @brambos was able to come up with a solution & not miss a beat. I think the new name Rozeta suits the app suite better. But...as @brice said they must have issues with Korg too, right? Nah... regardless of who came first they won't ring Korg's bell because it's a multi-million dollar corporation & their lawyers would go medieval.
Either way, Bram is a class act all around for being so accommodating.
It's hilarious what @MonzoPro said, true too. Original names for any kind of intellectual property is tough business after 100 years + of recorded patents, commerce and advertising. If you think of a cool name & really do your due diligence you'll know pretty quick if it's already got a flag stick in the center of it.
That's why, as bland & wha??? as it may be, I have used my name & initials for most of my music stuff; it can be the path of least resistance for sure.
Mmm Rozeta particles
Diamond geezer!
Seven pages of guff about a product, and not one single link to it on the App Store?
Given Apple’s brain-dead approach to app discovery, I propose an App Store link should be mandatory on the first page of any app-centric post. Indeed.
Rozeticles!
Awesome!! Which Synths were being triggered?
Yeah, these were the fun factor in the Caelestis app. Not sure that’s available anymore.
I think it was Phosphor for the lead part and Zeeon for the bass.
But I’m wondering if that isn’t just for the eye candy, because I can’t imagine the audible outcome being very different with other shapes?
Ah, thanks. I don't have Phosphor, I'll take a look.
To be honest, I prefer a more 'organised' generative sound, as in your video, but shapes like triangles, circles, and polygons are much more random, since the particles bounce into corners and rebound more quickly.
The shapes also rotated in Caelestis.
I really like the look and sound of your Particles plugin. I just mentioned Caelestis as @rrc2soft's comment reminded me of that app
...
Yo @brambos and anyone else who cares to test. I’m sending a Rozeta arpeggiator and Rozeta XY (cuz I’m stuck with a Minilogue as a controller and it has no modwheel), and using the latter only as a modwheel, into the same AU synth, especially Zeeon. This setup works for about 2-3 minutes before crashing, but it crashes every single time, usually after I switch windows from the arp to the XY and then touch the mod wheel.
Just tried that here (let the Arp run on Latch for 10 minutes, then switched to XY/Modwheel) fed into Zeeon but I'm not getting any strange behavior or crashes. I don't have a Minilogue so I was using the internal AUM keyboard but I can't imagine that making much of a difference. Is there anything else in the equation that could have an influence?
Would be lovely. Even lovelier with transpose.
Speaking of my Berlin-school obsession with sequence transposition, will it be possible to transpose the output from Particles via MIDI?
Also, will it have the 8-memories + follow actions set up some of the others have?
Yes, MIDI Transpose is already implemented.
I played with the idea, but dismissed it eventually because the concept of "start of loop" and "loop duration" are completely unclear. So the Follow Actions (and also the trigger of another pattern) would become quite confusing affairs.
I’ve got nothing else running besides AUM, Zeeon, 2 Rozetas (and the Minilogue connected by CCK). I’ll report back if I can figure out exactly what triggers the crash.
Thanks. I'll try on my other iPads as well, just in case.
Sweet!
Yeah, that makes a lot of sense, just from seeing the previews. Could it be divorced from the sequencer's internal notion of a 'loop' and just defer to the host's? Less from a 'request' point of view, more interested in your take.
How does the MIDI transpose work? Triggered with another Rozeta instance?
Any MIDI input works, e.g. try it using the AUM keyboard:
But nothing stops you from using another Rozeta plugin to send those MIDI notes.
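To illustrate the general idea behind MIDI-driven transposition (a sketch under assumptions, not Rozeta's actual implementation): a common scheme measures each incoming note against a reference root, here assumed to be middle C, and adds that distance to every note the sequencer emits.

```python
# Hypothetical sketch of MIDI-driven sequence transposition.
# The reference root (middle C = MIDI note 60) and the class below are
# assumptions for illustration, not Rozeta's actual code.

ROOT = 60  # MIDI note 60 = middle C, an assumed reference pitch


class TransposableSequence:
    def __init__(self, notes):
        self.notes = notes   # the pattern, as MIDI note numbers
        self.offset = 0      # current transpose amount, in semitones

    def on_transpose_input(self, incoming_note: int) -> None:
        """Any MIDI source works (AUM keyboard, another Rozeta plugin):
        the distance from the root becomes the transpose amount."""
        self.offset = incoming_note - ROOT

    def output(self):
        """Notes actually sent onward, clamped to the valid MIDI range."""
        return [max(0, min(127, n + self.offset)) for n in self.notes]


seq = TransposableSequence([60, 63, 67])  # a C minor triad pattern
seq.on_transpose_input(62)                # play a D on the controller
print(seq.output())                       # -> [62, 65, 69], up a whole tone
```

This is also why feeding the transpose input from another sequencer works: the second plugin just supplies the "incoming note" stream, stepping the whole pattern through a progression.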
Exactly my thinking. Touchscreen is fine, except when you need your hands or eyes (close to) somewhere else in a live situation, which is what music is all about for me, even at home. With MIDI, I can use my feet, my nose, or another algorithm to fire and control things. With the screen, I'm bound to the imagination the developer had about my workflow, which rarely fits.
So whatever I can't tweak over MIDI, in the end I won't use. The screen is good for locating functions at first, though.