Comments
Another question
What is the master's relationship with tracks 2-4, or how does it interact with those tracks?
Trying to get a good grasp of the master track.
1) Each time the Master ticks, it generates a note and sends it to each Slave Track (after certain probability calculations)
2) When the Slave Track receives a note, with a certain probability (the % value), it will accept that note
3) If it accepts the note, it will place it at the specified phase offset relative to its current position. So if the phase is 5, and it receives a note while it's at STEP 1...it will place that note at STEP 6.
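The three steps above can be sketched in code. This is purely illustrative; the names (`SlaveTrack`, `accept_probability`, `phase_offset`) are mine, not from the app, whose actual implementation is in Objective-C:

```python
import random

class SlaveTrack:
    """Hypothetical sketch of the slave-track logic described above;
    all names are illustrative, not taken from the app's code."""

    def __init__(self, accept_probability, phase_offset, step_count):
        self.accept_probability = accept_probability  # the % value in the UI (0.0-1.0)
        self.phase_offset = phase_offset              # steps ahead of the current position
        self.step_count = step_count                  # length of the track's grid
        self.grid = [None] * step_count               # note stored at each step, if any

    def receive(self, note, current_step, rng=random.random):
        """Handle a note sent by the master tick."""
        # 1) Accept the note only with the configured probability.
        if rng() >= self.accept_probability:
            return
        # 2) Place it at the phase offset relative to the current step,
        #    wrapping around the end of the grid: phase 5 while on STEP 1
        #    (index 0) lands on STEP 6 (index 5).
        target = (current_step + self.phase_offset) % self.step_count
        self.grid[target] = note
```

With `phase_offset = 5` and a note received at index 0, the note lands at index 5, matching the STEP 1 / STEP 6 example above.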
Wow!
No wonder I am getting intriguing and very usable results
So the master track's notes do not interact with the other tracks?
It generates a note for itself (no offset) and sends that same note to all connected Slave Tracks. You get interesting results because of the interaction with all phase offsets, probabilities, etc...
I like it. Is there a way to turn off the internal synth when sending midi out?
Just played a few minutes. Sending midi out to Volt. Turned the headphone icon off and back on, then off again. Lost all audio and Volt showed pegged red levels. All audio gone and will need a complete iPad restart to get back.
iPad 10.5 iOS12.1
The headphone icon shuts off the audio: it just prevents notes from being sent to the internal synth engine. So it's weird you had an audio spike; I'll see what's up.
Is there a significance to the 4th track’s template color being much darker than the top 3?
I was on the effects page when I was turning the headphone off and on. Just tried again after a restart, but it seems fine. Can’t reproduce the global audio loss.
Diggin this app though. Nicely done!
Nope, but it might be a good idea to have all Slave Tracks the same color.
Maybe. The top 3 are close to the same warm grey tone, but the 4th slave is dark grey. Since it stands out it implies there’s something unique about it. The tone on the 3rd slave reads the red in the trashcan icon better than the two above it. I think if you make them all the same, that one works best with your current palette. But, I also like the hint of warmer tones and each being a different tone too. I just wouldn’t make one drastically different from the others to avoid confusion.
I can see you're new to this forum. Did you build your code on any specific toolset to get started? I'm thinking of making something too, and your experience to this point could be helpful.
How did you get started? I'm thinking of using AudioKit and the code examples there... likely Swift, but maybe not. I guess you have to pass the code through Apple to even get it tested. Is that true?
I'm curious if various frameworks make the API bolt-ons less work:
AudioBus - the host here.
IAA - Apple's docs
AU - Apple's docs
What MIDI APIs are you using? CoreMIDI? Virtual MIDI, etc.?
Any clues would be appreciated. I will test your app. So far, I find the look nicely polished, and I appreciate the "dark look" at this time of night.
I think I do like it better with all 3 the same color, thanks
The majority of the code is the sequencer / function logic and the UI, written in Objective-C. I use MIDIBUS for sending MIDI and a modified AudioKit for the sound engine. Apple only has to test it if you want to submit it to the App Store or release a beta. You can write your own apps and install them on your iDevice for free with no review. The only catch is that the install will expire after a while, so you then have to re-build the code and upload it again.
It's daunting at first, but not too bad. My first app was Collider, which was an intelligent randomizer for the Rytm. I started with one ugly button on the screen that when pressed, would send out a single CC value. Once I knew I could send MIDI, I just started building upon that. Start small.
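For reference, sending "a single CC value" boils down to three bytes on the wire. A minimal sketch of just the byte layout (the function name is mine; actually sending the bytes would go through CoreMIDI or a wrapper like MIDIBUS):

```python
def control_change(channel, controller, value):
    """Build the three raw bytes of a MIDI Control Change message.

    channel: 0-15, controller: 0-127, value: 0-127. Only the byte
    layout is shown here; transmitting it is a separate step.
    """
    if not (0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127):
        raise ValueError("argument out of range")
    status = 0xB0 | channel   # 0xB0 is the Control Change status nibble
    return bytes([status, controller, value])
```

For example, `control_change(0, 74, 100)` yields the status byte `0xB0` followed by the controller number and value.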
Thanks.
I had a nice play-around with the TestFlight version and liked what it generated. It's nice that one is able to somewhat modify the notes that are output by drawing into the grids. I did set up several snapshots for an evolving jam (quieter and more busy parts). Could you add the snapshot pads to the save file/presets? They were not recalled when I later wanted to redo that session.
My personal workflow shifted from using external apps (IAA) to an AU-centric workflow where everything is saved inside a single host's session and can be totally recalled without the need for any preset loading, making connections, etc.
That's why I would prefer this app to be an AU MIDI generator, just generating the notes and the randomization UI without any audio. The synth could then be a second AU generator module supplied by the same standalone app (like iVCS / apeMatrix / Rozeta, which supply several different AUs in one app).
The guys with external modular synths, or the ones doing live sessions with several iPads, probably want PolyPhase to be a fullscreen stand-alone app, maybe integrated into an AB session to allow adding the synth sound; that's why they are calling for the Ableton Link / AB feature.
@mekohler
Truly incredible instrument you've built here. I've only spent about an hour with it and I can already tell this is going to get a lot of use. I think comparing this to FM is selling yourself short. This is in an entirely different league!
I really like that triplets are included, and would love to see a broader selection of subdivisions (dotted, etc.).
I'd personally love to see a range of subdivisions rather than traditional "note lengths" which I find not as helpful when thinking polyrhythmically - so instead of a 1n, 1t, 2n, 2t, 4n, 4t, 8n, 8t... (which skips tuplets other than threes) imagine a list that denotes how the bar is divided: 1, 1.5, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13... 32.
this way you could combine an odd division with an odd bar length for some very interesting polyrhythms/polymeters. Not to say that it isn't already quite capable of getting into some very interesting territory, just thinking out loud here.
I realize that maybe most people prefer thinking in terms of whole note, half note, etc. Personally, I find the 1n, 1t notation visually confusing, and it's too easy to unintentionally select a triplet.
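The suggested scheme can be sketched as follows. This is purely illustrative (the function name and the default 4-beat bar are my assumptions), showing how fractional division counts like 1.5 fall out naturally:

```python
import math

def subdivision_onsets(divisions, bar_length_beats=4.0):
    """Onset times (in beats) for a bar divided into `divisions` equal parts.

    `divisions` may be fractional (e.g. 1.5), matching the suggested
    list 1, 1.5, 2, 3, ... 32; a fractional count just means the cycle
    spills into the next bar. Only onsets inside this bar are returned.
    """
    step = bar_length_beats / divisions
    count = math.ceil(divisions - 1e-9)   # onsets that start inside the bar
    return [round(i * step, 6) for i in range(count)]
```

For example, `subdivision_onsets(7)` gives 7 evenly spaced onsets across the bar, a division the 1n/1t/2n list can't express.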
(another observation: the transpose adjusts on a different axis than the other parameters.)
Also, MIDI in for master transposition would be very cool.
oh and yeah, I agree with _ki on the AUv3 thing. I know most people around here would prefer to see it as a plugin (myself included).
that said, I'm pretty excited about this thing and respect whatever direction mekohler decides to take it in. Clearly has some good sense!
Perfect.
It is a quite common "fault" with apps, though it would be much appreciated to see it "fixed".
Some ideas on that:
Other things for finer detail adjustments: Increment full steps (or steps of 10, depending on the scale of things) on the Y-axis, and decimal adjustments (0.1, 0.4 etc) on the X-axis.
The latest DAW for iOS, Stagelight, has the exact same issue: my finger covers the volume control in the channel strip, meaning I have no idea if I just adjusted a track up or down by 4 dB until I remove my finger (or, of course, use my ears, which however aren't fine-tuned to the decimals). That they keep the entire mixer in a quarter of my 12.9" screen, with the rest of the screen black, doesn't help of course.
Just saying getting all these little details right probably takes time, and any attempts to do so would be appreciated, regardless of what the app does.
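One way to sketch the coarse/fine idea above: vertical drag moves the value in whole steps, horizontal drag in decimals. The function name, step sizes, and "detent" unit are my assumptions, not anything from an existing app:

```python
def drag_delta(dx_detents, dy_detents, coarse_step=1.0, fine_step=0.1):
    """Combine a two-axis drag into one value change: vertical movement
    (dy) adjusts in whole steps, horizontal movement (dx) in decimals,
    so a single gesture covers both coarse and fine adjustment.
    Detents would be derived from touch distance (e.g. points moved / 10).
    """
    return dy_detents * coarse_step + dx_detents * fine_step
```

So dragging up 2 detents and right 3 detents changes the value by 2.3: two whole steps plus three tenths.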
@mekohler how do the snapshots work? Not sure if they are working at all on my device, can't seem to save a snapshot at the moment....
if you ever decided to make this an AU plugin, you could perhaps have a separate module which was also a CC step sequencer with similar generative connections etc.
Well thanks mate, I’ve been up all bloody night playing with this thing. It’s incredible! Am I being daft asking for a swing function or can you replicate that using the phase offsets?
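On the swing question: phase offsets, as described earlier in the thread, move notes by whole steps, whereas swing delays the off-beat steps by a fraction of a step, so it would presumably need its own control. A rough sketch of what such a swing function might compute (names and defaults are my assumptions):

```python
def swing_onsets(n_steps, step_dur=0.5, swing=0.2):
    """Onset times (in beats) where every second step is delayed by
    `swing` (a fraction of the step duration), i.e. a classic swing
    feel. A sketch only; none of these names are from the app.
    """
    onsets = []
    for i in range(n_steps):
        t = i * step_dur
        if i % 2 == 1:            # delay the off-beat steps
            t += swing * step_dur
        onsets.append(t)
    return onsets
```

With the defaults, the off-beat eighths land 20% of a step late, a delay smaller than any whole-step phase offset could produce.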
Also if you feed ReSlice from it in pitch mode, you get some crazy stuff!
So in love with this.
So to allow testing of your app via TestFlight, you had to submit the app to Apple, right? I'm curious about that process, if you could share any details.
I have Xcode installed after finding more primary storage... apparently it won't run from an external disk.
I hope your story inspires more to consider making something.
I was hesitant to code in Apple's "walled garden" with languages that only work on iOS and OS X and only run on Apple devices...
But.
iOS has captured my musical imagination due to the price/value of the apps available, and the key is the great developers making incredible tools for us all. Love or hate Disneyland, but it's the Happiest Place on Earth, right? I'm just a big kid.
You could do as I do: only use proprietary "Apple" code for the proprietary parts of the app (e.g. UI, interacting with system frameworks, etc.) and write the rest in bog-standard C++. That way the parts of your app which are most valuable are easily ported to other platforms later on. Xcode lets you mix-and-match languages inside a project without hassle.