Odessa (LIVE - IT FINALLY HAPPENED!)

Comments

  • @ChrisG said:
    Does this actually require iOS 11 @brambos? I'm firmly stuck in iOS 10 land :/

    @brambos said:
    Yes, this is a new feature Apple have introduced in iOS 11.

    @ChrisG said:
    I haven't kept up with iOS lately; is it MIDI via AU that's iOS 11?

    @brambos said:
    Until iOS 11, Audio Unit instruments could only receive MIDI. Now they can also send MIDI back to the host, and the host can then decide what to do with it. GarageBand will record it into the current track (as shown in their WWDC video). AUM routes it to whatever destination you want.

    I'm also expecting Audiobus to implement it at some point, allowing AU plugins in their MIDI OUTPUT slots. It would make a lot of sense.

    @ChrisG said:
    Ah, I see; that does clear up a few things, thanks. Well, my iPad Air isn't getting any younger, and my iPhone has a large crack across the entire screen, a few bits and pieces missing, and some internal stuff like the light sensor isn't working. So I guess it's probably time to upgrade to new devices anyway :-)

    Bram, would you be willing to guesstimate the chances of Odessa going the standalone route at some point in the future? And if it did, would it work with iOS < 11?
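
(For the technically curious: the iOS 11 mechanism described in the exchange above is exposed on AUAudioUnit. Below is a minimal plugin-side sketch in Swift; midiOutputNames and midiOutputEventBlock are the actual AudioToolbox API, while the class name, output name, and helper method are invented for illustration.)

    import AudioToolbox

    final class SequencerAudioUnit: AUAudioUnit {
        // Advertise one MIDI output. Hosts that support AU MIDI (GarageBand,
        // AUM, ...) see this and install a midiOutputEventBlock to collect
        // whatever we send. (Sketch only; a real AU also needs the usual
        // initializers, buses, and render block.)
        override var midiOutputNames: [String] { ["Odessa Out"] }

        // Forward a note-on to the host, stamped at a render-time sample
        // position.
        func emitNoteOn(note: UInt8, velocity: UInt8, at sampleTime: AUEventSampleTime) {
            guard let send = midiOutputEventBlock else { return }  // host didn't connect MIDI out
            var bytes: [UInt8] = [0x90, note, velocity]            // note-on, channel 1
            _ = send(sampleTime, 0 /* cable */, bytes.count, &bytes)
        }
    }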

  • edited October 2017

    @brambos said:
    Out of courtesy to our good friend Matt @nanostudio I'm changing my project name to Odessa. Although I wasn't aware of it, Matt had chosen the Obsidian name a long time ago already.

    Great! Respect to you! By the way, I like the new name; my money can't wait for Odessa :-)

  • @syrupcore said:

    @ChrisG said:
    Bram, would you be willing to guesstimate the chances of Odessa going the standalone route at some point in the future? And if it did, would it work with iOS < 11?

    At this point unlikely. My experience is that making something standalone takes substantially more work than making something work as an AU (often 3x more). If I wanted to make this available standalone on iOS versions < 11 it would mean an almost complete rewrite as most of the code (and the UI) relies on things that are quite specific to AU.

    I prefer investing that time into adding more sequencer types to the suite. And there are plenty of standalone MIDI sequencers already around, so the added value as a standalone isn't immediately clear.

  • @brambos said:

    At this point unlikely. My experience is that making something standalone takes substantially more work than making something work as an AU (often 3x more). If I wanted to make this available standalone on iOS versions < 11 it would mean an almost complete rewrite as most of the code (and the UI) relies on things that are quite specific to AU.

    I prefer investing that time into adding more sequencer types to the suite. And there are plenty of standalone MIDI sequencers already around, so the added value as a standalone isn't immediately clear.

    Totally understand. I'd rather you make more sequencer types too! I'll get to 11 eventually.

  • Looks awesome. Cool new name. This may drag me onto 11 when it drops.

  • @syrupcore said:

    Totally understand. I'd rather you make more sequencer types too! I'll get to 11 eventually.

    What Mr. Syrup is really trying to say is that you should make a full-on iOS DAW (but it needs to work with iOS 10!)

  • edited October 2017

    @brambos

    What a generous gesture to change the name of your coming sequencer to pay homage to the diner in the East Village where I spent many a teetering late night and many a white-knuckled early morning.

  • edited October 2017

    Guys, I will take your fantasies even one step further...

    Imagine a world where Mr. BramBos & Mr. BlipInteractive are working together on an iOS DAW :-) Imagine this... imagine what they would be capable of by joining their superpowers :)

  • You dog, you! I always believed someone could do this and I'm happy it's you!

  • @brambos I love using the makers to sequence everything. One feature that would be super great is if the note input part could be scale-locked when I make individual adjustments to the notes.

  • Whenever I think of Odessa I think of this:

    I just watched it again and imagined trying to score it with brambos's sequencers .... just my brain ... I'll get my coat ;)

  • @brambos why did you break with your naming scheme?

  • @vpich said:
    @brambos why did you break with your naming scheme?

    This will be a very different type of app, not in the same family of products as my previous synths. It felt a bit awkward to force it into the "maker" nomenclature. If I make another synth, it will probably be a "...maker", but this is something that deserves a class of its own :)

  • @brambos said:
    It felt a bit awkward to force it into the "maker" nomenclature.

    Patternmaker?

  • @brambos said:

    This will be a very different type of app, not in the same family of products as my previous synths. It felt a bit awkward to force it into the "maker" nomenclature. If I make another synth, it will probably be a "...maker", but this is something that deserves a class of its own :)

    Fair enough; you've probably put a lot of thought into this, but I do think keeping uniformity would be good.

  • edited October 2017

    @vpich said:

    Fair enough; you've probably put a lot of thought into this, but I do think keeping uniformity would be good.

    OCD?
    Seems to matter very little to every other developer that ever existed.

  • I can see why some are interested in a deep project like a DAW, but I respectfully disagree. We have decent DAWs on iOS already, and they tend to accumulate design and technical debt, especially in the early stages. What is great about the brambos approach is that he's always out in front on new wrinkles (e.g. MIDI out from AUx) in a relatively lightweight design with excellent user experience and good audio quality. We need this frontiersman!

  • @brambos sorry if you already answered this: can these sequencers be triggered by a MIDI note, like the almighty Microtonic on desktop?

  • @Samplemunch said:
    @brambos sorry if you already answered this: can these sequencers be triggered by a MIDI note, like the almighty Microtonic on desktop?

    They run in sync with the host's timeline. They're not triggered by MIDI, but you can select patterns using MIDI notes; the selected pattern will then start and automatically play in sync with the host and all other instances of Odessa.

    Follow actions can be set to jump through the 8 patterns if you like.

    You can also use MIDI notes for transposing melodic patterns on the fly (+/- 1 octave).
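
(Not Odessa's actual code, which isn't public; this is a hypothetical Swift sketch of the behaviour just described, where incoming notes are partitioned into "select pattern" and "transpose" actions. The note ranges are invented.)

    // Hypothetical note map: C1..G1 select one of the 8 patterns (which then
    // start in sync with the host), notes around C3 transpose +/- 12 semitones.
    enum RemoteAction {
        case selectPattern(Int)   // 0...7
        case transpose(Int)       // -12...+12 semitones
    }

    func action(forNoteOn note: Int) -> RemoteAction? {
        switch note {
        case 24...31: return .selectPattern(note - 24)  // C1..G1
        case 48...72: return .transpose(note - 60)      // C3 = no transpose
        default:      return nil                        // everything else ignored
        }
    }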

  • edited October 2017

    Maybe a feature request for a future Odessa sequencer then?
    Trigger via Note on
    Stop via Note off
    Change pattern via Note number
    It is amazing on Microtonic and feels like you are triggering sample loops.
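
(The Microtonic-style triggering requested above amounts to a small note-gated state machine; a hypothetical sketch, with the playback plumbing stubbed out.)

    // Hypothetical gate: note-on starts the pattern mapped to that note
    // number, note-off of the same note stops playback - like triggering
    // a sample loop.
    final class GatedPatternPlayer {
        private var activeNote: Int?

        func noteOn(_ note: Int) {
            activeNote = note
            startPattern(note % 8)            // note number picks the pattern
        }

        func noteOff(_ note: Int) {
            guard note == activeNote else { return }
            activeNote = nil
            stop()
        }

        private func startPattern(_ index: Int) { /* begin pattern playback */ }
        private func stop()                     { /* silence the sequencer */ }
    }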

  • @nrgb said:

    @brambos said:
    It felt a bit awkward to force it into the "maker" nomenclature.

    Patternmaker?

    or something a bit different while still similar

    Sequencebreaker

  • Will this app be of any use in GarageBand (when it supports AU MIDI)? As I understand it, there wouldn't be a way to route the MIDI output to a synth.

  • edited October 2017

    @brambos said:

    • transpose via MIDI notes
    • multiple instances (that's kind of the point)
    • obviously all the mutation/random stuff you already expected
    • patterns selectable via MIDI

    Dude - you're making dreams come true here. I'm SUPER CHUFFED!

    As to "transpose via MIDI notes" - that's great. Can I suggest an addition to this functionality, commonly found in hardware keyboard arrangers? Namely, 2/3-finger left-hand chord/scale-type triggering.
    What does this mean? Simple - if I want to play the pattern in A major, then in the lowest octave (or a designated octave, or a designated MIDI channel, etc.) I play A and then a major 3rd above it. This instructs the arpeggiator to use the Ionian mode in A - namely A major. For A minor I play A and a minor third above it - C - and the arpeggiator plays a Dorian mode (the typical mode for minor). For a dominant 7 (often a Mixolydian mode) I'd play A and, above it, G (the minor 7th interval in a dominant 7 chord); for augmented I'd play A and E#, and so on. It might be worth googling a PDF manual from the support page of one of the common arrangers (PSR/Korg Pa series etc.), which will tell you the typical fingerings such arrangers use.

    Doing this would be MEGA for live performance.

    Also - someone might have mentioned this - but being able to queue up the NEXT root (or chord) to have it play on the next first beat of the bar (or Ableton Link quantum), in Ableton fashion, would be uber-cool.

    And and and... of course - in the future - being able to sequence chord changes like ChordFlow does - ACROSS INSTANCES OF THE SEQUENCER PLUGIN - would be awesome.

    Lastly - this all clearly suggests that it might be wise for developers of such sequencers to liaise and knock heads together in order to hash out a commonly agreed standard and "protocol" for sending root-chord-change instructions to whatever collection of arpeggiators or pattern sequencers happen to be running in a live-jam situation. On a more sophisticated note, there is also a need to be able to send chord changes "up front" to any app that is designed to do advanced MIDI reharmonisation - because for musicality it is not enough to merely transform pattern notes on the fly, as they are about to be played. Why? Because for maximum harmonic musicality it is necessary to know the WHOLE pattern, or at least a whole BAR of a pattern, in advance in order to best transpose the melodic arc in a musical manner. (Hope this makes sense - I wrote code to do this kind of transformation over a decade ago.)
    And then I'd actually suggest submitting this standard - and even shared middleware code - to Ableton, for them to simply incorporate into an Ableton Link v2 spec! I know they're interested in the idea since I discussed it with one of them on PM a while back.

    Great time to be coming into iPad music!
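
(The two-finger recognition described in the post above boils down to a lookup from the interval above the root to a mode. A hypothetical Swift sketch using only the fingerings given as examples; real arrangers recognise many more shapes.)

    // Lowest held note = root; the interval to the second note picks the mode.
    enum Mode { case ionian, dorian, mixolydian, augmented }

    func recognise(lowNote: Int, highNote: Int) -> (root: Int, mode: Mode)? {
        switch (highNote - lowNote) % 12 {
        case 4:  return (lowNote, .ionian)      // major 3rd, e.g. A + C#
        case 3:  return (lowNote, .dorian)      // minor 3rd, e.g. A + C
        case 10: return (lowNote, .mixolydian)  // minor 7th, e.g. A + G
        case 8:  return (lowNote, .augmented)   // augmented 5th, e.g. A + E#
        default: return nil
        }
    }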

  • edited October 2017

    One more thing - of course - and we've discussed it before, I believe: Apple doesn't allow true AUv3 MIDI effect plugins yet in iOS, or at least not officially in the OS.

    But do you think there might be some workaround possible between you, Kymatica (AUM) and other MIDI effect devs to let us write MIDI effect AUs to run in AUM? Or do we just have to wait for Apple's developers to get their fingers out and work on this? (Hopefully not involving waiting until iOS 12 :( )

  • Thanks for the helpful suggestions! :)

  • @nonchai said:
    One more thing - of course - and we've discussed it before, I believe: Apple doesn't allow true AUv3 MIDI effect plugins yet in iOS, or at least not officially in the OS.

    Yes, these AUv3 MIDI plugins can receive and send MIDI so you could build any MIDI effect you like. No proprietary hacks necessary.
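
(To make that concrete: the host delivers MIDI to an AUv3 as a linked list of AURenderEvents inside the render block, and iOS 11's midiOutputEventBlock sends MIDI back out, so a MIDI "effect" is read-transform-resend. In this Swift sketch the types are the real AudioToolbox ones; the octave-up transform is just an arbitrary example of an effect.)

    import AudioToolbox

    // Walk the host-supplied event list, transpose note-ons/offs up an
    // octave, and send them straight back out through the host's block.
    func handle(events: UnsafePointer<AURenderEvent>?, send: AUMIDIOutputEventBlock) {
        var event = events
        while let e = event {
            if e.pointee.head.eventType == .MIDI {
                let midi = e.pointee.MIDI
                var bytes = [midi.data.0, midi.data.1, midi.data.2]
                if bytes[0] & 0xF0 == 0x90 || bytes[0] & 0xF0 == 0x80 {  // note-on / note-off
                    bytes[1] = min(bytes[1] &+ 12, 127)                  // up one octave
                }
                _ = send(midi.eventSampleTime, midi.cable, Int(midi.length), &bytes)
            }
            event = UnsafePointer(e.pointee.head.next)  // next event in the list
        }
    }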

  • @brambos looks great! Does the AUv3 MIDI output spec also allow MIDI CC automation of AUv3 parameters?

  • @Artefact2001 said:
    @brambos looks great! Does the AUv3 MIDI output spec also allow MIDI CC automation of AUv3 parameters?

    Sure, the AUv3 can receive (and send) MIDI CC and translate that internally to its AU Parameters. That was already possible - all my plugins already did that.
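
(A minimal sketch of that CC-to-parameter translation: AUParameterTree and AUParameter are the real API, while the CC number and the parameter address 0 are invented for the example.)

    import AudioToolbox

    // Map incoming CC 74 (0...127) onto a hypothetical "cutoff" parameter
    // published in the plugin's AUParameterTree.
    func apply(ccNumber: UInt8, ccValue: UInt8, tree: AUParameterTree) {
        guard ccNumber == 74, let cutoff = tree.parameter(withAddress: 0) else { return }
        let normalized = AUValue(ccValue) / 127
        cutoff.value = cutoff.minValue + normalized * (cutoff.maxValue - cutoff.minValue)
    }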

  • @brambos said:

    Sure, the AUv3 can receive (and send) MIDI CC and translate that internally to its AU Parameters. That was already possible - all my plugins already did that.

    I guess I might be asking if your MIDI sequencers will allow 'drawing in' of CC data?
