
New App - PolyPhase - Beta is open - not mine but seems very cool


Comments

  • Another question :)
    What is the master's relationship with tracks 2-4 - or how does it interact with these tracks?
    Trying to get a good grasp on the master track.

  • @Mayo said:
    @mekohler - thanks for the lock tip.
    Can you explain the phase function - is it like a step delay from the master track?
    What is happening?

    1) Each time the Master is ticked, it will generate a note and send it to a Slave Track (after certain probability calculations)
    2) When the Slave Track receives a note, with a certain probability (the % value), it will accept that note
    3) If it accepts the note, it will place it at the specified phase offset relative to its current position. So if the phase is 5, and it receives a note while it's at STEP 1...it will place that note at STEP 6.
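
    Roughly, in Swift (a simplified sketch with made-up names, not the actual implementation):

        // Simplified sketch of the flow above - not the real code.
        struct SlaveTrack {
            var acceptProbability: Double   // the % value, as 0...1
            var phaseOffset: Int            // e.g. 5
            var currentStep: Int            // 0-based, so STEP 1 is index 0
            var steps: [UInt8?]             // one optional MIDI note per step

            // 2) + 3): maybe accept the note, then place it at the phase offset
            mutating func receive(note: UInt8) {
                guard Double.random(in: 0..<1) < acceptProbability else { return }
                let target = (currentStep + phaseOffset) % steps.count
                steps[target] = note        // phase 5 while at STEP 1 lands on STEP 6
            }
        }

        // 1): each Master tick sends the generated note to the connected Slave Tracks
        func masterTick(note: UInt8, slaves: inout [SlaveTrack]) {
            for i in slaves.indices {
                slaves[i].receive(note: note)
            }
        }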

  • Wow!
    No wonder I am getting intriguing and very usable results :|

  • So the master track's notes do not interact with the other tracks?

  • edited November 2018

    It generates a note for itself (no offset) and sends that same note to all connected Slave Tracks. You get interesting results because of the interaction with all phase offsets, probabilities, etc...

  • I like it. Is there a way to turn off the internal synth when sending midi out?

    Just played a few minutes, sending MIDI out to Volt. Turned the headphone icon off and back on, then off again. Lost all audio, and Volt showed pegged red levels. All audio gone; it will take a complete iPad restart to get it back.

    iPad 10.5, iOS 12.1

  • edited November 2018

    The headphone icon shuts off the audio just by preventing notes from being sent to the internal synth engine, so it's weird you had an audio spike. I'll see what's up.

  • Is there a significance to the 4th track’s template color being much darker than the top 3?

  • @mekohler said:
    The headphone icon shuts off the audio just by preventing notes from being sent to the internal synth engine, so it's weird you had an audio spike. I'll see what's up.

    I was on the effects page when I was turning the headphone icon off and on. Just tried again after a restart, but it seems fine. Can’t reproduce the global audio loss.

    Diggin this app though. Nicely done! :)

  • @skiphunt said:
    Is there a significance to the 4th track’s template color being much darker than the top 3?

    Nope, but it might be a good idea to have all Slave Tracks the same color.

  • edited November 2018

    @mekohler said:

    @skiphunt said:
    Is there a significance to the 4th track’s template color being much darker than the top 3?

    Nope, but it might be a good idea to have all Slave Tracks the same color.

    Maybe. The top 3 are close to the same warm grey tone, but the 4th slave is dark grey. Since it stands out it implies there’s something unique about it. The tone on the 3rd slave reads the red in the trashcan icon better than the two above it. I think if you make them all the same, that one works best with your current palette. But, I also like the hint of warmer tones and each being a different tone too. I just wouldn’t make one drastically different from the others to avoid confusion.

  • @mekohler said:
    Hi, this is my app.

    I can see you're new to this forum. Did you build your code on any specific toolset to get started? I'm thinking of making something too, and your experience to this point could be helpful.

    How did you get started? I'm thinking of using AudioKit and the code examples there... likely Swift, but maybe not. I guess you have to pass the code through Apple to even get it tested. Is that true?

    I'm curious if various frameworks make the API bolt-ons less work:

    AudioBus - the host here.
    IAA - Apple's docs
    AU - Apple's docs

    What MIDI APIs are you using? CoreMIDI? Virtual MIDI, etc.?

    Any clues would be appreciated. I will test your app. So far, I find the look nicely polished, and I appreciate the "dark look" at this time of night.

  • @skiphunt said:

    @mekohler said:

    @skiphunt said:
    Is there a significance to the 4th track’s template color being much darker than the top 3?

    Nope, but it might be a good idea to have all Slave Tracks the same color.

    Maybe. The top 3 are close to the same warm grey tone, but the 4th slave is dark grey. Since it stands out it implies there’s something unique about it. The tone on the 3rd slave reads the red in the trashcan icon better than the two above it. I think if you make them all the same, that one works best with your current palette. But, I also like the hint of warmer tones and each being a different tone too. I just wouldn’t make one drastically different from the others to avoid confusion.

    I think I do like it better with all 3 the same color, thanks :#

  • @McDtracy said:

    @mekohler said:
    Hi, this is my app.

    I can see you're new to this forum. Did you build your code on any specific toolset to get started? I'm thinking of making something too, and your experience to this point could be helpful.

    How did you get started? I'm thinking of using AudioKit and the code examples there... likely Swift, but maybe not. I guess you have to pass the code through Apple to even get it tested. Is that true?

    I'm curious if various frameworks make the API bolt-ons less work:

    AudioBus - the host here.
    IAA - Apple's docs
    AU - Apple's docs

    What MIDI APIs are you using? CoreMIDI? Virtual MIDI, etc.?

    Any clues would be appreciated. I will test your app. So far, I find the look nicely polished, and I appreciate the "dark look" at this time of night.

    The majority of the code is the sequencer/function logic and the UI, written in Objective-C. I use MIDIBUS for sending MIDI and a modified AudioKit for the sound engine. Apple only has to test it if you want to submit it to the App Store or release a beta. You can write your own apps and install them for free, with no review, on your iDevice. The only catch is that the install will expire after a while, so you then have to re-build the code and upload it again.

    It's daunting at first, but not too bad. My first app was Collider, which was an intelligent randomizer for the Rytm. I started with one ugly button on the screen that when pressed, would send out a single CC value. Once I knew I could send MIDI, I just started building upon that. Start small.
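
    For a sense of scale, the guts of that first ugly button can be a handful of CoreMIDI calls. Something like this minimal sketch (error handling omitted; assumes at least one MIDI destination is present):

        import CoreMIDI

        // Minimal sketch: send one CC (controller 74, value 64) on channel 1.
        var client = MIDIClientRef()
        MIDIClientCreate("Sketch" as CFString, nil, nil, &client)

        var outPort = MIDIPortRef()
        MIDIOutputPortCreate(client, "SketchOut" as CFString, &outPort)

        var packetList = MIDIPacketList()
        let packet = MIDIPacketListInit(&packetList)
        let bytes: [UInt8] = [0xB0, 74, 64]   // status, controller, value
        _ = MIDIPacketListAdd(&packetList, MemoryLayout<MIDIPacketList>.size,
                              packet, 0, bytes.count, bytes)

        // Fire it at the first destination, if there is one.
        if MIDIGetNumberOfDestinations() > 0 {
            MIDISend(outPort, MIDIGetDestination(0), &packetList)
        }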

  • @mekohler said:
    Objective-C
    MIDIBUS for sending MIDI
    AudioKit for the sound engine
    Apple only has to test it if you want to submit it to the App Store or release a beta. You can write your own apps and install them for free, with no review, on your iDevice.

    Thanks.

    • Start with Swift
    • Get something to work - a button that does something (see the sketch below)
    • (Likely compile an example and toss out code to start)
    • Add more code, but keep it working at every step
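
    For the first two bullets, a minimal SwiftUI sketch might look like this (a hypothetical starter, nothing more):

        import SwiftUI

        // The "one ugly button" starting point.
        @main
        struct OneButtonApp: App {
            var body: some Scene {
                WindowGroup {
                    Button("Send CC") {
                        // Wire a real MIDI send in here later.
                        print("button pressed")
                    }
                }
            }
        }
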
  • _ki
    edited November 2018

    I had a nice play-around with the TestFlight version and liked what it generated. It's nice that one is able to somewhat modify the notes that are output by drawing into the grids. I set up several snapshots for an evolving jam (quieter and more busy parts). Could you add the snapshot pads to the save file/presets? They were not recalled when I later wanted to redo that session.


    My personal workflow shifted from using external apps (IAA) to an AU-centric workflow where everything is saved inside a single host's session and can be totally recalled, without any preset loading, making connections, etc.

    That's why I would prefer this app to be an AU MIDI generator, just generating the notes and the randomization UI without any audio. The synth could then be a second AU generator module supplied by the same standalone app (like iVCS / apeMatrix / Rozeta supply several different AUs in one app).


    The guys with external modular synths, or the ones doing live sessions with several iPads, probably want PolyPhase to be a fullscreen stand-alone app, maybe integrated into an AB session to allow adding the synth sound - that's why they are calling for the Ableton LINK / AB feature :)

  • edited November 2018

    @mekohler

    Truly incredible instrument you've built here. I've only spent about an hour with it and I can already tell this is going to get a lot of use. I think comparing this to FM is selling yourself short. This is in an entirely different league!

    I really like that triplets are included, and would love to see a broader selection of subdivisions - dotted, etc.

    I'd personally love to see a range of subdivisions rather than traditional "note lengths", which I find less helpful when thinking polyrhythmically - so instead of 1n, 1t, 2n, 2t, 4n, 4t, 8n, 8t... (which skips tuplets other than threes), imagine a list that denotes how the bar is divided: 1, 1.5, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13... 32.

    This way you could combine an odd division with an odd bar length for some very interesting polyrhythms/polymeters. Not to say that it isn't already quite capable of getting into some very interesting territory; just thinking out loud here.
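
    For example, the math behind a "bar / N" list is trivial. A quick Swift sketch, assuming a 4/4 bar at a fixed tempo (numbers are illustrative):

        // How long is one step if the bar is divided into N equal parts?
        let bpm = 120.0
        let barSeconds = 4.0 * (60.0 / bpm)          // a 4/4 bar at 120 BPM = 2.0 s
        for division in [1.0, 1.5, 2, 3, 4, 5, 6, 7] {
            let step = barSeconds / division          // e.g. / 3 -> 0.667 s per step
            print("bar / \(division) = \(step) s per step")
        }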

    I realize that maybe most people prefer thinking in terms of whole note, half note, etc. Personally, I find the 1n, 1t notation visually confusing, and it's too easy to unintentionally select a triplet.

    (Another observation: the transpose adjusts on the x-axis, whereas the other parameters are adjusted on the y-axis.)

  • Also, MIDI in for master transposition would be very cool :)

  • Oh, and yeah, I agree with _ki on the AUv3 thing. I know most people around here would prefer to see it as a plugin (myself included).

    That said, I'm pretty excited about this thing and respect whatever direction mekohler decides to take it in. Clearly he has some good sense!

  • @mekohler said:
    Yep, Ableton Link is on the list. The first update after release will add selectable interfaces per Track, instead of just MIDI Channels. Once I feel everything is stable and I clean up the code, I will tackle integrating it with the outside world (Link, IAA, etc).

    Perfect.

  • @mekohler said:
    CC control of important parameters is a very good idea!
    I'll add those to control all of the Track properties (right side of a Track), important Synth parameters, and some CCs to toggle some of the randomization functions below.

    perfect.

  • @mekohler said:
    1) I agree on the WiFi icon; I'll fix that for release. Snapshots being too dim is addressed in the next beta.

    2) I wanted all settings and functions to be accessible so you can play it live, but that does result in some small UI. I'll definitely add a toggle to fold them away, so the sequencer takes up the entire space, or use a larger pop-up.

    3) I'm used to the Elektron style transport where they are separate buttons, and I wanted Start / Stop to send out their relevant MIDI commands regardless of the state of my sequencer. I'll remove the dimming, however.

    Lots of good suggestions so far :o

    perfect.

  • @_ki said:

    The guys with external modular synths, or the ones doing live sessions with several iPads, probably want PolyPhase to be a fullscreen stand-alone app, maybe integrated into an AB session to allow adding the synth sound - that's why they are calling for the Ableton LINK / AB feature :)

    Perfect.

  • @mekohler said:
    Yes, I know the small UI needs work; not sure what to do about it yet. Even though the UI is small, once you press your finger down on a value and start moving up / down, you can move your finger over a little to the right so you can see the values and not have them under your finger. Not ideal, I know!

    It's quite a common "fault" with apps, though it would be much appreciated to see it "fixed".

    Some ideas on that:

    • As you put your finger down on an adjustable ring control, it enlarges the ring so it safely ends up outside any area the finger would cover.
    • Alternatively, you have a visible label for the value X px to the left/right of the control.
    • I would love to see someone else implement how Thor (the synth) does it, which basically "projects" out a precise bar for the control you are adjusting.

    Other things for finer-detail adjustments: increment full steps (or steps of 10, depending on the scale of things) on the Y-axis, and decimal adjustments (0.1, 0.4, etc.) on the X-axis.
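
    A rough UIKit sketch of that coarse/fine idea (a hypothetical control; the names and pixel sensitivities are illustrative):

        import UIKit

        // Vertical drag = whole steps, horizontal drag = 0.1 steps.
        final class ValueDial: UIControl {
            var value: Double = 0 { didSet { sendActions(for: .valueChanged) } }
            private var startValue: Double = 0

            override init(frame: CGRect) {
                super.init(frame: frame)
                addGestureRecognizer(UIPanGestureRecognizer(target: self,
                                                            action: #selector(drag)))
            }
            required init?(coder: NSCoder) { fatalError("sketch only") }

            @objc private func drag(_ g: UIPanGestureRecognizer) {
                if g.state == .began { startValue = value }
                let t = g.translation(in: self)
                let coarse = (Double(-t.y) / 10).rounded()        // 1.0 per 10 pt up
                let fine = (Double(t.x) / 10).rounded() / 10      // 0.1 per 10 pt right
                value = startValue + coarse + fine
            }
        }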

    The latest DAW for iOS, Stagelight, has the exact same issue: fingers covering the volume control in the channel strip, meaning I have no idea if I just adjusted a track up or down by 4 dB until I remove my finger (or of course use my ears, which however aren't that finely tuned to the decimals). That they keep the entire mixer in a quarter of my 12.9" screen, with the rest of the screen black, doesn't help of course.

    Just saying getting all these little details right probably takes time, and any attempts to do so would be appreciated, regardless of what the app does. :)

  • @mekohler how do the snapshots work? :) Not sure if they are working at all on my device; can't seem to save a snapshot at the moment...

    If you ever decided to make this an AU plugin, you could perhaps have a separate module which was also a CC step sequencer with similar generative connections, etc.

  • @mekohler said:

    @McDtracy said:

    @mekohler said:
    Hi, this is my app.

    I can see you're new to this forum. Did you build your code on any specific toolset to get started? I'm thinking of making something too, and your experience to this point could be helpful.

    How did you get started? I'm thinking of using AudioKit and the code examples there... likely Swift, but maybe not. I guess you have to pass the code through Apple to even get it tested. Is that true?

    I'm curious if various frameworks make the API bolt-ons less work:

    AudioBus - the host here.
    IAA - Apple's docs
    AU - Apple's docs

    What MIDI APIs are you using? CoreMIDI? Virtual MIDI, etc.?

    Any clues would be appreciated. I will test your app. So far, I find the look nicely polished, and I appreciate the "dark look" at this time of night.

    The majority of the code is the sequencer/function logic and the UI, written in Objective-C. I use MIDIBUS for sending MIDI and a modified AudioKit for the sound engine. Apple only has to test it if you want to submit it to the App Store or release a beta. You can write your own apps and install them for free, with no review, on your iDevice. The only catch is that the install will expire after a while, so you then have to re-build the code and upload it again.

    It's daunting at first, but not too bad. My first app was Collider, which was an intelligent randomizer for the Rytm. I started with one ugly button on the screen that when pressed, would send out a single CC value. Once I knew I could send MIDI, I just started building upon that. Start small.

    Well, thanks mate, I’ve been up all bloody night playing with this thing. It’s incredible! Am I being daft asking for a swing function, or can you replicate that using the phase offsets?

  • Also if you feed ReSlice from it in pitch mode, you get some crazy stuff!

  • So in love with this. :heart:

  • @mekohler said:
    It's daunting at first, but not too bad.

    So to allow testing of your app via TestFlight, you had to submit the app to Apple, right? I'm curious about that process if you could share any details.

    I have Xcode installed after finding more primary storage... apparently it won't run from an external disk.

    I hope your story inspires more people to consider making something.
    I was hesitant to code in Apple's "Walled Garden" with languages that only work on iOS and OS X and only run on Apple devices...

    But.

    iOS has captured my musical imagination due to the price/value of the apps available, and the key is the great developers making incredible tools for us all. Love or hate Disneyland, but it's the Happiest Place on Earth, right? I'm just a big kid.

  • edited November 2018

    @McDtracy said:
    I was hesitant to code in Apple's "Walled Garden" with languages that only work on iOS and OS X and only run on Apple devices...

    You could do as I do: only use proprietary "Apple" code for the proprietary parts of the app (e.g. UI, interacting with system frameworks, etc.) and write the rest in bog-standard C++. That way the parts of your app which are most valuable are easily ported to other platforms later on. Xcode lets you mix-and-match languages inside a project without hassle.
