AudioVeek Piano Roll MIDI sequencer AU


Comments

  • Awesome idea!

  • @Rodolfo said:

    @blueveek said:

    @soundshaper said:

    @blueveek said:

    @soundshaper said:
    @blueveek This looks awesome, exactly what I've been waiting for... MIDI looping inside of AUM or AB with real-time recording and clip launching!

    So just to be clear, I'd want to use this also with external synths. So say I have my Sub 37 connected and set the input and outputs to the external ports... Can I start recording a loop into your plugin by punching in on the beat (like you said quantized launch) then record notes and knob CCs as long as I want (let's say 12 bars) then punch out and have it immediately start looping? If so this is genius.

    The way it's set up, "punch in/out" works a little differently than what you described, and I also think you have a very valid use-case here. This plugin goes to great lengths to stay absolutely synced with the host, so once the host starts playing, the playhead matches. If the host pauses, the playhead remains where the host paused. If the host moves its own playhead (if one exists), this plugin matches it, and so on.

    All of this is, of course, modulo looping duration (literally). So if the host's playhead is at bar 9 while the piano roll clip's duration is set to 4 bars, then host time is bar 9, plugin time is bar 1, and so on.

    Now of course, you can also start playing notes after the loop repeats, and they get recorded normally, which seems to me like it matches your use-case, but please correct me if I'm wrong.

    Auto-extending the loop duration while recording, then "punching out" to start playing, might be tricky. The question is when to stop recording and start playing. The easiest approach is to listen for some CC which says "I'm done recording, start playing". I think this is a valid approach, and I'm curious to hear your thoughts.

    @soundshaper said:
    May I also suggest that there is a way to turn off MIDI input monitoring (if it is even needed) so that external synths can stay in Local On mode.

    Really looking forward to this!

    So thru off?

    I think I may have used the terms punch in and punch out incorrectly. I was just confirming the basic function: you press record and start recording MIDI into the piano roll, then press record again at any time when finished (the loop is quantized to the next bar) and it begins looping? Essentially that, like Ableton, you don't have to predefine the loop length.

    Looping happens automatically, based on a predefined number of bars ("duration") you've set up beforehand. I like the idea of auto-growing this duration, though. Sounds like there should be an option between:
    1. Auto-growing the duration by 1 bar when, while recording, the playhead goes past the predefined duration.
    2. Auto-looping back to the start while recording. This allows building up drum patterns and is a common workflow.

    How does that sound?

    Sorry for jumping into the conversation, but it sounds great to me!
    If you don't know the loop length in advance, the default "auto-growing" duration could be just 1 bar, and it would work like Ableton.

    Excellent, that's what I suspect as well. I'll try to squeeze it in for release.
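The sync rule described in this exchange, plugin time equals host time modulo the loop duration, can be sketched as follows (the function name is illustrative, not from the plugin):

```c
#include <assert.h>

/* Map the host's transport position (in beats) onto a loop-local
 * playhead: with a 4-bar (16-beat) clip and the host at bar 9
 * (beat 32, counting bars and beats from 0), the clip is back at
 * the start of its loop, i.e. bar 1 in 1-based terms. */
double loop_local_beat(double host_beat, double loop_beats) {
    double b = host_beat;
    while (b >= loop_beats)   /* equivalent to fmod(), avoids libm */
        b -= loop_beats;
    return b;
}
```

The same mapping also covers pause and seek: whatever position the host reports, reducing it modulo the loop length gives the clip's playhead.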

  • @blueveek said:
    @j_liljedahl Can AUM be notified to update the name it displays under plugin icons, based on the audioUnitShortName the plugin sends? Does it check it for each instance, and/or can it be updated at any moment?

    It would be nice to be able to name each instance of this Piano Roll, instead of all of them looking the same on every AUM track.

    Yes, AUM listens for changes to this property. A plugin can do something like this:

    - (void)updateShortName:(NSString *)name {
        // Manually fire KVO notifications so hosts observing
        // audioUnitShortName (such as AUM) pick up the change
        [self willChangeValueForKey:@"audioUnitShortName"];
        _shortName = name;
        [self didChangeValueForKey:@"audioUnitShortName"];
    }

    - (NSString *)audioUnitShortName {
        return _shortName;
    }
    
  • @j_liljedahl said:

    @blueveek said:
    @j_liljedahl Can AUM be notified to update the name it displays under plugin icons, based on the audioUnitShortName the plugin sends? Does it check it for each instance, and/or can it be updated at any moment?

    It would be nice to be able to name each instance of this Piano Roll, instead of all of them looking the same on every AUM track.

    Yes, AUM listens for changes to this property. A plugin can do something like this:

    Fantastic!

  • @blueveek said:

    Looping happens automatically, based on a predefined number of bars ("duration") you've set up beforehand. I like the idea of auto-growing this duration, though. Sounds like there should be an option between:
    1. Auto-growing the duration by 1 bar when, while recording, the playhead goes past the predefined duration.
    2. Auto-looping back to the start while recording. This allows building up drum patterns and is a common workflow.

    How does that sound?

    1. Yes, basically if you do not predefine a loop length, it keeps adding a bar as you pass the next bar while recording. Perfect. And yes, this is the same as Ableton, which is ideal for real-time recording of improvisations, because many times you don't know how long your phrase will be.
    2. I would think the most logical way to do this is an Overdub switch which, when ON, loops back to the start and continues recording when you press record/stop record. The next time you press record/stop record, it actually stops recording. So in this mode you press twice when done recording, because the first press sets the loop length and starts overdubbing. Note that in this mode the auto-growing duration would be inactive, since the loop length is set by the first punch-out.

    Really appreciate your interest in our comments and suggestions. This app is really needed and could be a game-changer in the workflows of many of us!
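A minimal sketch of the two behaviours agreed on above (all names are hypothetical, not the plugin's actual API): auto-grow extends the loop by a bar whenever the record head passes the current duration, while overdub wraps the head back to the start and keeps layering.

```c
#include <assert.h>
#include <stdbool.h>

/* Hypothetical sketch of the two looping modes discussed above. */
typedef struct {
    int duration_bars;  /* current loop length */
    int playhead_bar;   /* bar the record head is currently in */
    bool overdub;       /* true: wrap and layer; false: auto-grow */
} LoopState;

/* Advance the record head by one bar while recording. */
void advance_one_bar(LoopState *s) {
    s->playhead_bar++;
    if (s->playhead_bar >= s->duration_bars) {
        if (s->overdub)
            s->playhead_bar = 0;   /* loop back to the start, keep layering */
        else
            s->duration_bars++;    /* grow the loop by one bar */
    }
}
```

Note that, exactly as soundshaper says, the two modes are mutually exclusive here: once overdub wraps the head, the duration is fixed.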

  • @j_liljedahl Does AUM support AUScheduleMIDIEventBlock? Looks like it isn't null, but seems to behave like a black hole.

    If it does work, what sample time does it expect? Does it support AUEventSampleTimeImmediate?

  • @blueveek said:
    @j_liljedahl Does AUM support AUScheduleMIDIEventBlock? Looks like it isn't null, but seems to behave like a black hole.

    If it does work, what sample time does it expect? Does it support AUEventSampleTimeImmediate?

    Yes, Rozeta uses both.

  • edited March 2019

    @brambos said:

    @blueveek said:
    @j_liljedahl Does AUM support AUScheduleMIDIEventBlock? Looks like it isn't null, but seems to behave like a black hole.

    If it does work, what sample time does it expect? Does it support AUEventSampleTimeImmediate?

    Yes, Rozeta uses both.

    Are there any resources on this besides the headers? Does it deliver the MIDI automatically? Or does it schedule an event to trigger internalRenderBlock later?

    Wait, my bad: AUScheduleMIDIEventBlock is only called by hosts, not by plugins. Plugins invoke the MIDIOutput block directly.

  • edited March 2019

    @brambos said:
    Wait, my bad: AUScheduleMIDIEventBlock is only called by hosts, not by plugins. Plugins invoke the MIDIOutput block directly.

    Interesting, that was my approach so far. However, I'm wondering if there's a better way of scheduling (relatively distant) future MIDI events that isn't buffer-size dependent.

    E.g. a buffer size of 1024 samples means an AU can only output MIDI events every so often, which might be too low a resolution. I've experimented with longer offsets on the relative timestamp passed to AUMIDIOutputEventBlock, but no dice. Apple's docs also suggest that AUEventSampleTimeImmediate can only be used with a "small 4096 sample offset".
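For scale, the resolution blueveek mentions is easy to quantify: a render callback for a 1024-sample buffer at 44.1 kHz fires only about every 23 ms, so MIDI emitted once per callback is limited to that granularity.

```c
/* Period between render callbacks for a given buffer size:
 * period (ms) = frames / sample_rate * 1000. */
double buffer_period_ms(int frames, double sample_rate) {
    return frames / sample_rate * 1000.0;
}
```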

    You can time everything with sample accuracy. Simply use timestamps. But for distant events you’ll have to buffer them yourself. No one said it was going to be easy :D

  • @brambos said:
    You can time everything with sample accuracy. Simply use timestamps. But for distant events you’ll have to buffer it yourself. No one said it was going to be easy :D

    Sure, it just seems that no matter what timestamp I pass to AUMIDIOutputEventBlock, all MIDI is still delivered immediately (unless the relative timestamps are very small). I'll experiment some more and slip into your DMs if no luck :)

    Thanks for the info on AUScheduleMIDIEventBlock.

  • Ah there you go.. the timestamps should be absolute.

  • You can message me if you’re stuck, but I’m in Tokyo right now and I have a long ass flight coming up, so I may not be able to respond very quickly B)

  • @brambos said:
    You can message me if you’re stuck, but I’m in Tokyo right now and I have a long ass flight coming up, so I may not be able to respond very quickly B)

    No rush, thanks for helping out. Cheers!

  • Time for a new post in this series by Gene de Lisa?

  • edited March 2019

    @brambos said:
    You can time everything with sample accuracy. Simply use timestamps. But for distant events you’ll have to buffer it yourself. No one said it was going to be easy :D

    That's what I do in my AU apps: maintain a FIFO buffer for future events that I can't output in the current render block.
    The tricky part is maintaining timestamp order (e.g. MIDI files only have note-on + duration, so the note-off needs to be scheduled in the future). Add to the mix transposition, instancing...
    The good thing is this FIFO buffer then gives you nicely timestamped rendering.

    I don't use AUScheduleMIDIEventBlock myself, preferring to maintain the queue myself, but AudioKit has an example of using it with AUEventSampleTimeImmediate.
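The FIFO approach described above, combined with brambos's point that timestamps should be absolute, can be sketched like this (all names are illustrative, not from any shipping app): events too far in the future for the current render block are held back, then emitted once their absolute sample time falls inside a later block's window.

```c
#define MAX_PENDING 256

typedef struct {
    long long sample_time;   /* absolute sample time of the event */
    unsigned char bytes[3];  /* raw MIDI message, e.g. note-on/off */
} MidiEvent;

typedef struct {
    MidiEvent events[MAX_PENDING];
    int count;
} MidiFifo;

/* Queue a future event; silently drops events when full (a sketch,
 * a real plugin would need a lock-free ring buffer instead). */
void fifo_push(MidiFifo *f, MidiEvent e) {
    if (f->count < MAX_PENDING)
        f->events[f->count++] = e;
}

/* Emit every event whose time falls before block_start + frames and
 * keep the rest for a later block; returns the number emitted. The
 * emit callback stands in for the AU's MIDI output block and may be
 * NULL, in which case due events are just discarded. */
int fifo_flush(MidiFifo *f, long long block_start, int frames,
               void (*emit)(const MidiEvent *)) {
    int emitted = 0, kept = 0;
    for (int i = 0; i < f->count; i++) {
        if (f->events[i].sample_time < block_start + frames) {
            if (emit)
                emit(&f->events[i]);
            emitted++;
        } else {
            f->events[kept++] = f->events[i];
        }
    }
    f->count = kept;
    return emitted;
}
```

Calling `fifo_flush` once per render cycle with that cycle's absolute start time makes the scheduling independent of buffer size, which is exactly the problem raised above.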

  • Am I the only one who finds the devs talking gibberish to each other strangely entertaining? :)

  • May I just say that seeing developers (potential competitors) helping each other warms my heart... :)

  • @lasselu said:
    May I just say that seeing developers (potential competitors) helping each other warms my heart... :)

    Individually we try to make up for Apple's silence. I learnt most of my AU knowledge from Gene de Lisa :)

  • It is precisely this, developers ensuring their products are compatible with one another through consistent coding, that makes these tools better for everyone.

  • @blueveek said:

    @brambos said:

    @blueveek said:
    @j_liljedahl Does AUM support AUScheduleMIDIEventBlock? Looks like it isn't null, but seems to behave like a black hole.

    If it does work, what sample time does it expect? Does it support AUEventSampleTimeImmediate?

    Yes, Rozeta uses both.

    Are there any resources on this besides the headers? Does it deliver the MIDI automatically? Or does it schedule an event to trigger internalRenderBlock later?

    http://devnotes.kymatica.com/ios_midi_timestamps

  • edited March 2019

    @j_liljedahl said:


    http://devnotes.kymatica.com/ios_midi_timestamps

    Thanks, I've already implemented all of this with AUMIDIOutputEventBlock :)

    My original question was whether the magical AUScheduleMIDIEventBlock can be used by AUs instead of AUMIDIOutputEventBlock (in combination with AUEventSampleTimeImmediate) for more ergonomic code, but based on @brambos's earlier answer I guess not.

  • @lasselu said:
    May I just say that seeing developers (potential competitors) helping each other warms my heart... :)

    It is fascinating, and it makes me think: when we all grab the pitchforks and yell ‘MOAR FEATURES!!’, we tend to forget just how much work goes into even the most mundane of things. Maybe next time we all start getting impatient and demanding features for our favourite apps, we should pause and remember that in a lot of cases it’s just one person coding all this. We’re lucky to get what we get; there have been some outstanding apps on iOS, and we wouldn’t have shit without these guys.

    P.S. @brambos @j_liljedahl @midiSequencer @blueveek: where would you guys recommend starting if anyone were interested in learning to code MIDI/audio apps? Is prior coding experience essential, or could you learn just that specific aspect and pick the rest up along the way?

  • edited March 2019

    @Mull not wanting to discourage you, but realtime MIDI and especially audio programming is among the hardest stuff you'll find in an already quite "extraterrestrial" field (software engineering), so be warned ;) You'll need a good grasp of general programming concepts and data structures, know several programming languages (C, Objective-C, and for efficient code, ARM assembly as icing on the cake ;)), and you'll need to learn all the relevant specialized APIs like Audio Units, CoreMIDI, etc., whose official documentation from Apple ranges from "meh" to "docuWHAT?" ;)

  • @SevenSystems said:
    @Mull not wanting to discourage you, but realtime MIDI and especially audio programming is among the hardest stuff you'll find in an already quite "extraterrestrial" field (software engineering), so be warned ;) You'll need a good grasp of general programming concepts and data structures, know several programming languages (C, Objective-C, and for efficient code, ARM assembly as icing on the cake ;)), and you'll need to learn all the relevant specialized APIs like Audio Units, CoreMIDI, etc., whose official documentation from Apple ranges from "meh" to "docuWHAT?" ;)

    @SevenSystems I would have guessed as much given the complexity of the timings and all that arcane black magic!

  • @SevenSystems I would have included you on the list of resident dev gurus but didn't think you were on the thread, mate! Love Xequence, it's my daily driver 👍🏻

  • @Mull said:
    @SevenSystems I would have included you on the list of resident dev gurus but didn't think you were on the thread, mate! Love Xequence, it's my daily driver 👍🏻

    Thanks, much appreciated :) I'm lurking on every thread here with one of my 183 personalities, so be warned ;)

  • @SevenSystems said:

    @Mull said:
    @SevenSystems I would have included you on the list of resident dev gurus but didn't think you were on the thread, mate! Love Xequence, it's my daily driver 👍🏻

    Thanks, much appreciated :) I'm lurking on every thread here with one of my 183 personalities, so be warned ;)

    Meh.. I’m too pretty to need a personality 🤣 Btw, you still thinking of making Xequence an AU host?

  • @Mull said:


    Meh.. I’m too pretty to need a personality 🤣 Btw, you still thinking of making Xequence an AU host?

    It's still listed as a "want", yeah... especially since the whole mixer user interface is already in place (from the development of the Xequence-based DAW), so "only" the actual AU hosting infrastructure would need to be added... don't take it as a promise, though... :innocent:
