New App - PolyPhase - Beta is open - not mine but seems very cool -

Comments

  • Neat app!

    UI nits: I notice that the transpose arrows sit very close to the horizontal home-indicator bar at the bottom of the new all-screen iPads. The entire bottom panel full of labeled actions seems unfinished; I hope it will take on the UI style of the rest of the app.

    Want: I want AUv3.

  • Yep, I will make sure the UI works for the new iPads. I only own a 9.7 and 10.5...I will optimize for larger sizes right before the official release.
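
    For anyone curious about the home-indicator overlap mentioned above, the usual fix is to pin the bottom panel to the safe area. A minimal UIKit sketch (names are illustrative; PolyPhase's real controls are custom):

    ```swift
    import UIKit

    class SequencerViewController: UIViewController {
        // Hypothetical panel holding the transpose arrows and labeled actions.
        let bottomPanel = UIView()

        override func viewDidLoad() {
            super.viewDidLoad()
            view.addSubview(bottomPanel)
            bottomPanel.translatesAutoresizingMaskIntoConstraints = false
            NSLayoutConstraint.activate([
                bottomPanel.leadingAnchor.constraint(equalTo: view.leadingAnchor),
                bottomPanel.trailingAnchor.constraint(equalTo: view.trailingAnchor),
                // The safe area keeps the panel clear of the home-indicator
                // bar on the all-screen iPads (and notched iPhones).
                bottomPanel.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor),
                bottomPanel.heightAnchor.constraint(equalToConstant: 80)
            ])
        }
    }
    ```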

  • @reasOne said:

    @mekohler said:
    CC control of important parameters is a very good idea!
    I'll add CCs to control all of the Track properties (right side of a Track) and important Synth parameters, plus some CCs to toggle some of the randomization functions below.

    Awesome!! This is what I was going to request... and being able to send each sequence out on a different channel, or a different CC, of the same or different synths.
    Really loving this so far! It's fun and really does a good job at generating useful sequences...

    Please don’t hard-code MIDI CCs! Allow MIDI learn; if you go AU, use AU params; or have a dialog to choose the CC mapping... Hard-coding is bad ®️

  • edited December 2018

    @mekohler said:

    @McDtracy said:

    @mekohler said:
    Hi, this is my app.

    I can see you're new to this forum. Did you build your code on any specific toolset to get started? I'm thinking of making something too, and your experience to this point could be helpful.

    How did you get started? I'm thinking of using AudioKit and the code examples there... likely Swift, but maybe not. I guess you have to pass the code through Apple to even get it tested. Is that true?

    I'm curious if various frameworks make the API bolt-ons less work:

    AudioBus - the host here.
    IAA - Apple's docs
    AU - Apple's docs

    What MIDI APIs are you using? CoreMIDI? Virtual MIDI, etc.?

    Any clues would be appreciated. I will test your app. So far, I find the look nicely polished, and I appreciate the "dark look" at this time of night.

    The majority of the code is the sequencer / function logic, and the UI...written in Objective-C. I use MIDIBUS for sending MIDI and a modified AudioKit for the sound engine. Apple only has to test it if you want to submit it to the App Store or release a beta. You can write your own apps and install them on your own device for free, with no review. The only catch is that the install expires after a while, so you then have to re-build the code and upload it again.
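
    For anyone else starting out: a bare-bones AudioKit 4 engine (the major version current as of this thread) is only a few lines, playground-style. Purely illustrative; PolyPhase uses a modified build:

    ```swift
    import AudioKit

    // One oscillator routed straight to the output.
    let osc = AKOscillator(waveform: AKTable(.sawtooth))
    AudioKit.output = osc
    do {
        try AudioKit.start()
        osc.start()
    } catch {
        print("AudioKit failed to start: \(error)")
    }
    ```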

    It's daunting at first, but not too bad. My first app was Collider, which was an intelligent randomizer for the Rytm. I started with one ugly button on the screen that when pressed, would send out a single CC value. Once I knew I could send MIDI, I just started building upon that. Start small.
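
    That "one ugly button" starting point, done with plain CoreMIDI rather than MIDIBUS, might look roughly like this (illustrative playground-style sketch):

    ```swift
    import CoreMIDI

    var client = MIDIClientRef()
    var outPort = MIDIPortRef()
    MIDIClientCreate("CCDemo" as CFString, nil, nil, &client)
    MIDIOutputPortCreate(client, "Out" as CFString, &outPort)

    // Send a single Control Change message to one destination.
    func sendCC(_ cc: UInt8, value: UInt8, channel: UInt8, to dest: MIDIEndpointRef) {
        var packetList = MIDIPacketList()
        let packet = MIDIPacketListInit(&packetList)
        let bytes: [UInt8] = [0xB0 | (channel & 0x0F), cc, value] // status, controller, value
        MIDIPacketListAdd(&packetList, MemoryLayout<MIDIPacketList>.size, packet, 0, bytes.count, bytes)
        MIDISend(outPort, dest, &packetList)
    }

    // e.g. CC 74 at full value on channel 1, to the first destination found:
    if MIDIGetNumberOfDestinations() > 0 {
        sendCC(74, value: 127, channel: 0, to: MIDIGetDestination(0))
    }
    ```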

    Whoa! You’re the genius that did Collider! If you can figure out Elektron’s MIDI as well as you did, everything else in life is going to be easy! 😜

    Now I can run Collider for Rytm and this for AK!

    I’d like to see a setting in each track for gate length as a percent from 1% to at least 150% so you can have ties and legato for slides.

    It appears that slave track steps never get removed, do they? If the master is at 100% and I start the slaves blank, eventually they fill up, but steps are never removed. Is that by design?

    Also, here’s a video of what may be a bug, or me just not understanding how things work... Note that track one almost never gets notes. If I restart the entire sequencer it will occasionally throw one in, but if it doesn’t get one on the first hit of the master, it never gets one... bug or feature?

    It also might be cool to have a way to limit or set the note range for the master track, not just scale...

  • @mekohler said:
    Yep, I will make sure the UI works for the new iPads. I only own a 9.7 and 10.5...I will optimize for larger sizes right before the official release.

    I think your user interface design is great. What API did you use to make it? A few developers roll their own to be unique, and I’m sure that’s another level of difficulty on top of supporting AU, IAA, and multiple MIDI interfaces together, right?

    I love free standards and open source code kits.

  • edited December 2018

    Notes only get removed if a REST is created...the probability that a rest is created is the percentage on the Master Track labeled "RST". The Slave Tracks will choose to accept or deny the REST just like a normal note, using its %. Changing the RST and RPT values on the Master will greatly alter the generated melodies.

    In general I wouldn't worry too much about trying to predict what will happen, as there are so many probabilities and values working together (Master %, Master RST %, Master RPT %, Slave %, Slave Offset)...this structure makes it hard to wrap your head around what exactly will happen, but you get musical results.

    If your Slave %'s are low, then they will almost never accept the Master Track's note / rest...Since everything is probability-based, you will have some bars where nothing happens...just increase all %'s (not RST or RPT) slightly if it's too sparse for you.
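
    For readers trying to picture the flow, here is a toy model of the master/slave probability scheme described above. It is a guess at the structure for illustration only, not the shipped code (Slave Offset is left out for brevity):

    ```swift
    import Foundation

    // The master proposes a note, rest, or repeat; each slave accepts
    // or ignores that event according to its own probability.
    struct MasterTrack {
        var restPercent: Double   // "RST": chance the event is a rest
        var repeatPercent: Double // "RPT": chance the previous note repeats
        var lastNote: Int = 60

        mutating func nextEvent(scale: [Int]) -> Int? {
            if Double.random(in: 0..<100) < restPercent { return nil }        // rest
            if Double.random(in: 0..<100) < repeatPercent { return lastNote } // repeat
            lastNote = scale.randomElement()!
            return lastNote
        }
    }

    struct SlaveTrack {
        var acceptPercent: Double // chance of adopting the master's note/rest
        var steps: [Int?] = Array(repeating: nil, count: 16)

        mutating func offer(_ event: Int?, at step: Int) {
            if Double.random(in: 0..<100) < acceptPercent {
                steps[step % steps.count] = event // a rest (nil) can clear a step
            }
        }
    }

    var master = MasterTrack(restPercent: 20, repeatPercent: 30)
    var slave = SlaveTrack(acceptPercent: 60)
    for step in 0..<16 { slave.offer(master.nextEvent(scale: [60, 62, 64, 67, 69]), at: step) }
    ```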

  • @McDtracy said:
    What API did you use to make it?

    It's all custom code subclassing Apple's UIView and CALayer, although I did already have some elements from older projects.
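
    For the newbie-dev clues being collected in this thread, that approach boils down to something like the sketch below: a UIView subclass that draws its own content (every UIView is backed by a CALayer; heavier customization subclasses CALayer directly). Illustrative only, not PolyPhase's actual classes:

    ```swift
    import UIKit

    class StepCell: UIView {
        var isActive = false { didSet { setNeedsDisplay() } }

        override func draw(_ rect: CGRect) {
            // Custom drawing instead of stock UIKit controls.
            guard let ctx = UIGraphicsGetCurrentContext() else { return }
            ctx.setFillColor(isActive ? UIColor.cyan.cgColor : UIColor.darkGray.cgColor)
            ctx.fillEllipse(in: rect.insetBy(dx: 2, dy: 2))
        }
    }
    ```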

  • @mekohler said:

    @McDtracy said:
    What API did you use to make it?

    It's all custom code subclassing Apple's UIView and CALayer, although I did already have some elements from older projects.

    I’ll bet you spent most of the time developing that aspect of the app, unless you started with that skill set. GUIs are hard to get right, especially across 7-inch to 12-inch displays.
    Any advice? Will you keep rolling your own UIs?

    (I keep pumping for newbie dev clues.) OT? Maybe, but it keeps the thread active too, so that’s OK.

  • @mekohler, not to jump the gun or anything, and I'm very much looking forward to your PolyPhase release, but I'd just like to make a special request well ahead of the rush: for your next app, a drum sampler that includes all of the Elektron tricks you know, with transient-detection sample slicing as part of its feature set. It would be a dream come true.

  • Here is the help text describing the new Note Filtering that will be available:

    • At the top is your Global Key / Scale
    • Below are various Chord Keys and Scales you can toggle
    • If both a Chord Key and Scale are selected, then this Chord is passed to the Global Key / Scale, where it will be filtered again if Filter Chords is toggled
    • If only a Chord Key is selected, then only the Global Key / Scale is applied
    • If only a Chord Scale is selected, then the Global Scale will use the Chord Scale

    This might sound confusing, but it lets you have the option to always be in Key while playing any chord, or to do whatever you want and modulate to other Keys and go wild.
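
    One possible reading of those rules in code, for anyone who wants to trace them; this is a sketch of the logic as described, not the shipped implementation (scales here are sets of pitch classes 0-11):

    ```swift
    // Returns true if the note passes the filter described above.
    func passes(note: Int,
                globalKey: Int, globalScale: Set<Int>,
                chordKey: Int?, chordScale: Set<Int>?,
                filterChords: Bool) -> Bool {
        func inScale(_ n: Int, key: Int, scale: Set<Int>) -> Bool {
            scale.contains((n - key + 120) % 12)
        }
        switch (chordKey, chordScale) {
        case let (key?, scale?):
            // Chord Key + Chord Scale: build the chord, then optionally
            // filter the result against the Global Key / Scale.
            let inChord = inScale(note, key: key, scale: scale)
            return filterChords ? (inChord && inScale(note, key: globalKey, scale: globalScale))
                                : inChord
        case (_?, nil):
            // Chord Key only: just the Global Key / Scale applies.
            return inScale(note, key: globalKey, scale: globalScale)
        case let (nil, scale?):
            // Chord Scale only: the Global Key uses the Chord Scale.
            return inScale(note, key: globalKey, scale: scale)
        case (nil, nil):
            return inScale(note, key: globalKey, scale: globalScale)
        }
    }
    ```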

  • @MonkeyDrummer said:
    Please don’t hard-code MIDI CCs! Allow MIDI learn; if you go AU, use AU params; or have a dialog to choose the CC mapping... Hard-coding is bad ®️

    CC mapping will be selectable when it's implemented (similar to the MIDI channels)...maybe MIDI learn after that.
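
    For reference, a MIDI-learn layer on top of a selectable mapping can be quite small. A hedged sketch with hypothetical names (arm a parameter, then bind the next incoming CC):

    ```swift
    // Tap a control to arm it; the next incoming CC number gets bound to it.
    final class CCMapper {
        private var map: [UInt8: String] = [:]   // CC number -> parameter name
        private var learning: String?            // parameter armed for learn

        func arm(parameter: String) { learning = parameter }

        func handle(cc: UInt8, value: UInt8, apply: (String, UInt8) -> Void) {
            if let param = learning {            // first CC after arming wins
                map[cc] = param
                learning = nil
            }
            if let param = map[cc] { apply(param, value) }
        }
    }
    ```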

  • edited December 2018

    Hey @mekohler
    Did you use AudioKit for the Synth/Fx section? (it sounds/behaves similarly)
    If so, send us a note to [email protected] and we’ll blog about it
    🤘

  • Another suggestion... Make an option so that it doesn't register controls until you lift your finger. Then you can "cue" up a transpose using right slider for example and it won't play all the steps on the way to -12, +7, etc. Make sense?
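
    With a stock UISlider this is exactly what isContinuous = false gives you: .valueChanged fires only on touch-up, so the in-between steps never sound. PolyPhase's controls are custom, but the same defer-the-commit idea applies. An illustrative sketch:

    ```swift
    import UIKit

    class TransposeController: UIViewController {
        let transposeSlider = UISlider()

        override func viewDidLoad() {
            super.viewDidLoad()
            transposeSlider.minimumValue = -12
            transposeSlider.maximumValue = 12
            // false = only send .valueChanged when the finger lifts,
            // so dragging to -12 or +7 never plays the steps in between.
            transposeSlider.isContinuous = false
            transposeSlider.addTarget(self, action: #selector(commit(_:)), for: .valueChanged)
            view.addSubview(transposeSlider)
        }

        @objc func commit(_ slider: UISlider) {
            let semitones = Int(slider.value.rounded())
            print("Transpose committed: \(semitones)")
        }
    }
    ```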

  • @analog_matt said:
    Hey @mekohler
    Did you use AudioKit for the Synth/Fx section? (it sounds/behaves similarly)
    If so, send us a note to [email protected] and we’ll blog about it
    🤘

    Hi! Yes, the synth engine uses AudioKit! I will send you an email when it's ready for release :#

  • @MonkeyDrummer said:
    Another suggestion... Make an option so that it doesn't register controls until you lift your finger. Then you can "cue" up a transpose using right slider for example and it won't play all the steps on the way to -12, +7, etc. Make sense?

    Yep, that's a very good idea.

  • @mekohler said:

    @MonkeyDrummer said:
    Another suggestion... Make an option so that it doesn't register controls until you lift your finger. Then you can "cue" up a transpose using right slider for example and it won't play all the steps on the way to -12, +7, etc. Make sense?

    Yep, that's a very good idea.

    YES!

  • In the help text, it mentions a camera icon to switch between snapshots and a waveform view?
    Is this part of the obsolete help text you mentioned, or do I just not see the camera icon?

  • @CracklePot said:
    In the help text, it mentions a camera icon to switch between snapshots and a waveform view?
    Is this part of the obsolete help text you mentioned, or do I just not see the camera icon?

    Obsolete help text, will be fixed in the next beta (almost done)

  • @mekohler I think Propellerheads have copied your app!

  • edited December 2018

    @Jumpercollins said:
    @mekohler I think Propellerheads have copied your app!

    Noooooooo....well, at least it's not for iOS :'(

    Oh well, I don't know how their generative melody works. I really like the approach I took with the phase offsets and probability, but we'll see.

    Beta 2 is pretty much done, and it's what Beta 1 should have been. Most of the requests from this thread have been implemented, and everything feels much more consistent and solid. It's like a brand new app :)

  • @mekohler said:

    @Jumpercollins said:
    @mekohler I think Propellerheads have copied your app!

    Noooooooo....well, at least it's not for iOS :'(

    Oh well, I don't know how their generative melody works. I really like the approach I took with the phase offsets and probability, but we'll see.

    Beta 2 is pretty much done, and it's what Beta 1 should have been. Most of the requests from this thread have been implemented, and everything feels much more consistent and solid. It's like a brand new app :)

    Great, looking forward to testing it. I've just been playing with Aparillo using PolyPhase; the two go together great.

  • @mekohler Found an issue I think...

    So I've got PP routed into AUM, driving 3 instances of iSEM plus FX, EQs, etc.
    When AUM is in the foreground everything is fine, and CPU @ 1024 samples is like 50%.
    When I switch to PP, AUM starts sputtering and crackling. I've even had it totally kill the iOS audio (similar to another person's post earlier, requiring a restart of the iPad).
    Switch back to AUM, everything is fine. Switch to another IAA like Shoom, everything is fine.
    The culprit seems to be PolyPhase. I have PP's synth turned off as well.

    Also, clock receive doesn't seem to work. I've tried various things, like sending clock out from AUM (the feature he built in for hardware that he warns about using with software, but which works fine in my experience for non-Link-enabled devices). I've also tried a couple of Link-to-MIDI-clock apps. PolyPhase just sits there with the red stop button activated and the play button off, like it "wants" to receive clock, but no combination of clock-in options, other than internal clock, seems to work.

  • Clock receive was broken in the last beta; it works in the next.

    I have never encountered any dropped audio issues, but I did change some internal things. In the past I was still sending note-offs to the engine, and when you minimize the app I shut off the AudioKit engine...maybe those note-offs still being sent to it were causing trouble. I am having 0 issues with the latest version, so let's wait and see. Sorry about that :[
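
    For anyone debugging their own clock chain: receiving MIDI clock comes down to timing the 0xF8 ticks, which arrive 24 times per quarter note. A generic sketch, not PolyPhase's code:

    ```swift
    import Foundation

    // BPM = 60 / (24 * seconds-per-tick)
    final class ClockReceiver {
        private var lastTick: TimeInterval?
        private(set) var bpm: Double = 0

        func handle(statusByte: UInt8, at time: TimeInterval) {
            switch statusByte {
            case 0xFA:                       // Start: reset timing
                lastTick = nil
            case 0xF8:                       // Timing clock tick
                if let last = lastTick {
                    let secondsPerTick = time - last
                    if secondsPerTick > 0 { bpm = 60.0 / (24.0 * secondsPerTick) }
                }
                lastTick = time
            default:
                break
            }
        }
    }
    ```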

  • @Jumpercollins said:
    @mekohler I think Propellerheads have copied your app!

    That seems more like Xynthesizr, really...

  • Hope that Propellerheads app comes to iOS too, there's plenty of room in the pool for this kind of thing :)

  • Well done

  • @mekohler I gotta beg again on adding a gate/note length option. Like being able to have 1n for track “speed” but have like 8n for note length. Or 4n for speed and 1n for length to build chords with notes that come in and out...

    Pretty please 🥺 ?

  • @MonkeyDrummer said:
    @mekohler I gotta beg again on adding a gate/note length option. Like being able to have 1n for track “speed” but have like 8n for note length. Or 4n for speed and 1n for length to build chords with notes that come in and out...

    Pretty please 🥺 ?

    I know what you mean, but I don't have a solution yet. When I use it with Ableton, I add a note-length MIDI device which extends all notes.
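
    The note-length-device trick can be approximated in code by holding each note-off back and re-emitting it later. A hedged sketch; `send` stands in for whatever MIDI output call the host uses:

    ```swift
    import Foundation

    // Stretches every note by a fixed factor by delaying its note-off.
    final class NoteExtender {
        var stretch: Double = 4.0   // e.g. an 8n step held for a half note
        let send: ([UInt8]) -> Void

        init(send: @escaping ([UInt8]) -> Void) { self.send = send }

        func noteOn(_ note: UInt8, velocity: UInt8, channel: UInt8, duration: TimeInterval) {
            send([0x90 | channel, note, velocity])
            // Re-emit the note-off after the stretched duration.
            DispatchQueue.main.asyncAfter(deadline: .now() + duration * stretch) {
                self.send([0x80 | channel, note, 0])
            }
        }
    }
    ```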

  • @MonkeyDrummer said:
    @mekohler I gotta beg again on adding a gate/note length option. Like being able to have 1n for track “speed” but have like 8n for note length. Or 4n for speed and 1n for length to build chords with notes that come in and out...

    Pretty please 🥺 ?

    Really - this would be amazing if you can find a way?
    I've been using the app constantly, and gate / note length would be my no. 1 request.
    ...chords with notes that come in and out... ;)

  • @Mayo said:

    Really - this would be amazing if you can find a way?
    I've been using the app constantly, and gate / note length would be my no. 1 request.
    ...chords with notes that come in and out... ;)

    I have no doubt that his super-brain will figure it out. If you've used Collider with an Analog Rytm... the stuff he was able to get the Rytm to do and respond to is unreal. Elektron couldn't even get Overbridge working after, what, seems like 20 years. And Collider just made them look reeeeaaaaalllllyyyy lame. He did it on his own!

    So making notes longer? Yea, I think he's got that. :)
