Comments

  • @brambos said:

    @soundtemple said:

    @motmeister said:
    Ok, @OnNewBar doesn’t/shouldn’t happen until HostBar changes. Technically it doesn’t “change” until the beginning of the second measure. What it looks like to me is that the AUM transport doesn’t report the initial state of HostBar (or perhaps Mozaic doesn’t look for it) in time to be available when the first @OnMidiNoteOn occurs. It’s not about the order of events inside of Mozaic. It’s that HostBar doesn’t change in time for @OnMidiNoteOn to get its correct state.

    I’ve looked at this with logging several dozen different ways, and it’s the only conclusion I can come to. When Mozaic DOES become aware of a change in HostBar (does it get it from AUM?), one or more MIDI notes may have already been received. The condition isn’t consistent either, which also leads me to believe it’s a timing issue between AUM and Mozaic, NOT a failure in Mozaic.

    The script I’m writing depends on knowing which measure a note begins in, and this condition prevents its accuracy, so it would be helpful if someone could take a look. I’d be happy to share my script with a dev if it will help.

    Thanks!

    @motmeister Did you ever get a response from @brambos on this? I have a similar issue and believe there is a problem where AUM and Mozaic don’t communicate HostBeat when the host is stopped.

    So, hitting the reset HostBeat/Bar button in AUM does NOT update the HostBeat value in Mozaic when the transport is stopped (i.e. after you hit the play/pause button to stop).

    Hitting AUM’s transport reset whilst in playback will reliably reset HostBeat in Mozaic.

    If you check the log with this little Mozaic script in AUM, you can see HostBeat is reset when in playback but not when the host is stopped.

    The annoying part about this is that it’s very difficult to start AUM and Mozaic on HostBeat = 0.

    `

    @OnHostStart
    Log {---- started ----}
    @End

    @OnHostStop
    Log {---- stopped ----}
    @End

    @OnNewBeat
    Log {Hostbeat: }, HostBeat
    @End

    @OnNewBar
    Log {--- new bar ---}
    @End

    `

    I'm looking into this now. Would you (@soundtemple and @motmeister ) be interested in checking out a beta-version to see if the issue has been reliably fixed?

    Just to get the issue clear, it's not about receiving the @OnNewBar event when the transport isn't running, but about getting the correct value from the HostBar variable?

    Right. When it’s running, there doesn’t seem to be enough time for Mozaic to query AUM for the data. I’d like to help with a beta, but you might have to coach me into it. :)
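
    One defensive workaround on the script side is to cache the bar number whenever the host reports it and read the cache in the note handler, instead of trusting HostBar at note time. A minimal sketch of the idea (an illustration only, not brambos's actual fix):

    `
    @OnLoad
      lastBar = 0
    @End

    @OnHostStart
      lastBar = HostBar    // latch whatever the host reports at transport start
    @End

    @OnNewBar
      lastBar = HostBar    // keep the cache current while running
    @End

    @OnMidiNoteOn
      // read the cached bar instead of querying HostBar directly
      Log {note }, MIDINote, { began in bar }, lastBar
      SendMIDIThru
    @End
    `

    If notes really do arrive before the first bar update, this at least makes the failure visible in the log instead of silently mis-tagging notes.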

  • @drewinnit said:
    @_ki you're a gentleman and a scholar. I really appreciate the time you've taken to help me.

    Your latest code was 99% there; I made two small changes - I inverted the LED on and off values, so that the button is lit when it is unmuted, and I also shifted the index value up by 16 - like you mentioned, it was affecting the wrong row of buttons. Full code pasted below for reference.

    My only remaining question is whether you have a Paypal? I’d love to buy you some beer/coffee to show my appreciation. you've saved me hours of head scratching!

    @OnLoad
      SetShortName {LPLEDS} 
      ShowLayout 2
      LabelPads {Update Launch Control XL LEDs from CC#3  v1}
      LabelKnobs { }
      for knob = 0 to 3
        SetKnobValue knob, 0  
        LabelKnob knob, { }
      endfor  
    
      // LED Sysex 
      //       F0h    00h  20h  29h  02h  11h  78h Tmpl Idx  Val   F7h  
      sysButton[] = [0x00,0x20,0x29,0x02,0x11,0x78,0x00,0x00,0x00]
    
      // LED color constants (see Launch Control XL MIDI manual, top of page 4)
      // Values deliberately inverted so a pad lights while unmuted
      LED_MUTED   = 12   // LED dark
      LED_UNMUTED = 62   // yellow full
    @End
    
    @OnMidiCC
      if MIDIByte2 = 3
        LatchPad MIDIChannel, (MIDIByte3 = 0)
        if MIDIByte3 = 0
          LabelPad MIDIChannel,{Muted}
        else
          LabelPad MIDIChannel, { } 
        endif      
    
        for template = 0 to 2
          sysButton[6] = template
          sysButton[7] = 24 + MIDIChannel // Button index with offset
          if MIDIByte3 = 0
            sysButton[8] = LED_MUTED      // CC value 0 -> muted -> LED dark
          else
            sysButton[8] = LED_UNMUTED    // otherwise lit
          endif      
          SendSysex sysButton, 9          // SendSysex adds the F0/F7 framing itself
        endfor
      endif
    @End
    

    Good stuff!

  • @McD said:
    The Novation Launch Control XL ($149 at Sweetwater) looks like it might be a handy hardware extension for iPad users:

    I did see a Mozaic script for the Novation Launchpad (several models available):

    I'd imagine I would use knobs and sliders more than pads, and 16 pads would be enough for me.

    It might be nice to make a library of scripts to use these devices with our iPads via Mozaic.

    They would also make interesting HUIs for Mozaic scripting in general, since I find on-screen knob tweaking frustrating for live/recording uses. Having a physical knob that you can adjust slowly would ensure a better result.

    On another note: has anyone connected a better app for on-screen "knobs/sliders" into Mozaic? Any recommendations?

    Any editable Mozaic script for Launchpads?

    I wish there was an editable Mozaic script for Launchpad hardware. I would love a color-coded template to do it all:
    notes, CC and PC on different channels, with toggle or momentary.
    I need it for:
    Mute/unmute
    Triggering samples
    Playing synths
    Effects sends

  • @brambos said:

    I'm looking into this now. Would you (@soundtemple and @motmeister) be interested in checking out a beta-version to see if the issue has been reliably fixed?

    Just to get the issue clear, it's not about receiving the @OnNewBar event when the transport isn't running, but about getting the correct value from the HostBar variable?

    Yes please, @brambos - I would love to test a beta version of this. I have invested my whole COVID experience in building Mozaic tools, and this is something that has been causing me some grief!!!

    In my recent experience in AUM with Ableton Link, it seems AUM and other MIDI processors always start the host on beat 1, whereas Mozaic appears to carry on from where it left off on occasion.

    PS: Sorry, I missed the response earlier.

  • edited June 2020

    @soundtemple said:
    Yes please, @brambos - I would love to test a beta version of this.

    In my recent experience in AUM with Ableton Link, it seems AUM and other MIDI processors always start the host on beat 1, whereas Mozaic appears to carry on from where it left off on occasion.

    The update is already live :) Please check with the latest version on the App Store if this already solves your issue. I think it might.

    About beat 1 versus starting where it left off: Mozaic always derives the current bar and beat from what AUM communicates. So if you don't rewind AUM, you won't start on beat 1. Any other behaviour would quickly become very unpredictable.

  • I'm wrapping my head around how to design a new script and could use help from the experts. Here's what I'm trying to do: assign each of eight specific notes to send on some combination of the 16 MIDI channels. So, something like this:

    C2 sends 1, 5, 7
    D2 sends 1, 2, 7
    F2 sends 3, 4, 6
    etc

    The notes are static in my use case. They are C2, G#2, D2, F#2, F2, D#2, D#3, C#3

    The combination of MIDI channels assigned to each note should be easily configured using a knob or slider.

    Thoughts?
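
    One way to sketch this in Mozaic is a lookup table: the eight trigger notes plus a flat 8 x 16 flag array holding each note's channel set. The snippet below only illustrates the idea - note numbers assume C2 = 36, channels are zero-based, and the example flags are placeholders rather than a real configuration:

    `
    @OnLoad
      triggers = [36, 44, 38, 42, 41, 39, 51, 49]   // C2, G#2, D2, F#2, F2, D#2, D#3, C#3
      FillArray chanOn, 0, 128                      // 8 trigger rows x 16 channel flags
      chanOn[0] = 1                                 // row 0 (C2): channel 1
      chanOn[4] = 1                                 // row 0 (C2): channel 5
      chanOn[6] = 1                                 // row 0 (C2): channel 7
    @End

    @OnMidiNoteOn
      for i = 0 to 7
        if MIDINote = triggers[i]
          for ch = 0 to 15
            if chanOn[i * 16 + ch] = 1
              SendMIDINoteOn ch, MIDINote, MIDIVelocity
            endif
          endfor
        endif
      endfor
    @End
    `

    A matching @OnMidiNoteOff handler would repeat the loop with SendMIDINoteOff, and the flags could be edited from knobs or pads instead of being hard-coded.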

  • edited June 2020

    @brambos any chance of expanding Mozaic (or creating another plugin based on the Mozaic core) for audio processing? Of course I can see how this wouldn't be very CPU-efficient, but it would be an amazing playground for learning DSP basics, understanding DSP algorithms, and prototyping DSP ideas directly on the iPad...

  • @brambos So, I have the new update loaded, but I still seem to have an issue when Ableton Link is enabled.
    With Ableton Link enabled, AUM will (cue the start and then) always start on beat 1. Other plugins like StepPolyArp and Atom Piano Roll also start on the 1; however, Mozaic does not. It seems to carry on from where it left off.

    So, with Ableton Link off, all three (Mozaic, SPA and Atom) stay in sync (perfect); however, with Ableton Link enabled, Mozaic gets out of sync. Hopefully this video demonstrates it.

    https://photos.app.goo.gl/eUPWFq9YbqSBsL76A

  • @lukesleepwalker That's a very specific use-case - at first I thought you meant to distribute Rozeta pattern change notes onto specific channels (maybe to control several Rozeta sequencers at once), but then I noticed that your input notes differ.

    They also differ from the Ruismaker drum input notes - so what do these notes control, and why do they need to be sent out on different channel combinations? Just to better understand what you want to achieve.

  • @soundtemple said:
    @brambos So, I have the new update loaded, but I still seem to have an issue when Ableton Link is enabled.
    With Ableton Link enabled, AUM will (cue the start and then) always start on beat 1. Other plugins like StepPolyArp and Atom Piano Roll also start on the 1; however, Mozaic does not. It seems to carry on from where it left off.

    So, with Ableton Link off, all three (Mozaic, SPA and Atom) stay in sync (perfect); however, with Ableton Link enabled, Mozaic gets out of sync. Hopefully this video demonstrates it.

    https://photos.app.goo.gl/eUPWFq9YbqSBsL76A

    Thanks for the clear video. I can indeed reproduce it so I'll check what's different when Link is active.

  • edited June 2020

    @_ki said:
    @lukesleepwalker That's a very specific use-case - at first I thought you meant to distribute Rozeta pattern change notes onto specific channels (maybe to control several Rozeta sequencers at once), but then I noticed that your input notes differ.

    They also differ from the Ruismaker drum input notes - so what do these notes control, and why do they need to be sent out on different channel combinations? Just to better understand what you want to achieve.

    I use the ACPad controller on my acoustic guitar to control "scenes" as I play a piece. Those are the notes assigned to the eight pads on the controller. I set up a series of scenes that use a bunch of samples in different instances of AudioLayer, and then turn samples on/off depending on which scene I select. I use Audiobus mutes to do this now, but it's laborious and hard to remember which scene gets which samples, as I must configure each AudioLayer channel individually. I'm thinking I could use the AUM channel filter to select whether a given instance of AudioLayer plays during a selected scene. Much faster and easier to see all in one view.

  • _ki
    edited June 2020

    @lukesleepwalker Wow, the ACPad is a cool controller B)

    .

    Are the notes the 'standard' ones defined for the pads of the controller? I didn't find a manual for that one, only a video showing the PC configuration software - so the notes mentioned above could also be user-defined.

    I don't know if I've fully understood everything yet. The note receivers are several AudioLayer instances - does each of them contain several samples (one for each of the notes C2, G#2, D2, F#2, F2, D#2, D#3, C#3), or only a single sample (but then configured to play at the same 'speed' regardless of the triggering note number)?

    If the controller sends a scene note, is that note 'latched' until another scene pad is pressed, or is it just a trigger and the samples are latched in AudioLayer?

    .

    Sorry, I still don't get how it's supposed to work :)

    I'm thinking I could use the AUM channel filter to select whether a given instance of AudioLayer plays during a selected scene. Much faster and easier to see all in one view.

    Currently you work in Audiobus and have configured a mute action for each of the AudioLayer instances?

    But with such a note-to-specific-channels script, you could use AUM's channel filters, and each AudioLayer instance would only listen on a single channel?

    .

    I might have some other ideas on how to solve the problem - but for that I need to understand in detail what's planned, so I don't write guesses based on wrong assumptions.

  • @_ki said:
    Sorry, I still don't get how it's supposed to work :)

    I might have some other ideas on how to solve the problem - but for that I need to understand in detail what's planned, so I don't write guesses based on wrong assumptions.

    I believe I haven't painted a complete enough picture of how it works, so the gaps are creating confusion. Let me explain it fully:

    I "play" samples with my feet using FreeDrum Bluetooth MIDI sensors (https://freedrum.rocks/). These send out four notes that trigger the samples in AudioLayer. (I use different random and round-robin techniques to create evolving and varied arrangements, but that's not germane to this explanation.) I play my acoustic guitar while triggering the samples with my feet. Most songs have somewhere between 4 and 10 different "instruments" that I set up in AudioLayer. So, if the first section of the song needs piano, synth bass and pitched percussion, I'll activate the AudioLayer instances that include those samples. The next section of the song might include the piano, turn off the bass and percussion, and add in a vocal sample. These are the "scenes" I've previously described.

    The ACPad pads emit notes (the ones I specified above) and I use those notes to trigger scenes. I currently do this in Audiobus by muting/unmuting AudioLayer instances, mapping each note from the ACPad to either mute or unmute an instance. It's quite laborious, and I sometimes get mixed up as I try to recall which notes should be "on" for a given scene.

    My idea (and it may not be a good one! :smile: ) is instead to use the AUM MIDI channel filter to easily configure each song section/scene by setting the MIDI channel filters for each AudioLayer instance. So, ACPad note C2 is sent to a Mozaic script and Mozaic "enables" MIDI channels 1, 2 and 3 but blocks all others. For the next scene, ACPad note G#2 is sent to Mozaic and Mozaic enables MIDI channels 3, 5 and 7 but blocks all others.

    In my head, I designed this in Mozaic so that I "select" the right note for each scene (incoming from the ACPad) using Mozaic pads, then tap shift to configure the MIDI channels that should be enabled. Tap shift to go back and select the next note for the next scene, tap shift to configure its MIDI channels, and so on...

    I hope that makes sense.

  • @lukesleepwalker I hope you find a solution to this problem. I figure if @_ki or @wim can't help you with this, no one can. These guys are an incredible asset to us in the community.
    Good luck, fellow mad scientist!

  • @hypnopad said:
    @lukesleepwalker I hope you find a solution to this problem. I figure if @_ki or @wim can't help you with this, no one can. These guys are an incredible asset to us in the community.
    Good luck, fellow mad scientist!

    Indeed, they are! Thanks for the good wishes.

  • _ki
    edited June 2020

    I think I've now got what the various components do, or 'should' do:

    • Several AudioLayer instances with samples played by the foot-controller (you didn't yet specify the notes sent)
    • An ACPad controller sending out C2, G#2, D2, F#2, F2, D#2, D#3, C#3 to change which of the AL instances will receive the notes from the foot-controller

    But - the script you described above would, for instance, send out the note C2 on channels 1, 5 and 7 when C2 is received. How would this action change the foot-controller channels?

    I could not find an input channel toggle action in AUM, only mute for the loaded plugin. But each mute would only react to a single note, so if you assign C2 on ch 1 to mute the AL plugin in the AUM track, you can't unmute/mute it with another scene trigger note.

    It seems I am still missing a part of your idea on how it should work.

    .
    .

    Shouldn't it be the foot-controller notes that are routed through the script, with the scene switch notes (also routed to the script) just changing the 'script-internal routing' to the output channels for those foot-controller notes? When C2 is received, the foot-switch notes would be echoed to ch 1, 5 and 7 (to all three of them, if I got this correctly). All AL instances would listen to this single script, but each of them only listens on a single channel. Which channel is activated for each of the trigger notes can be configured via the GUI (see the sketch below).

    For this it would be best if the foot-controller sends its notes on a different channel than the ACPad, to make both types of notes easier to distinguish. (The above idea would not work if the same note is issued on the same channel by the foot-controller and by the ACPad - if the notes are all distinct it could work, but I would prefer different channels for the different tasks.)
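
    A compact Mozaic sketch of that routing (purely illustrative: C2 is assumed to be note 36, channels are zero-based, the foot-controller is assumed to be on channel 10, and the channel sets are placeholders):

    `
    @OnLoad
      sceneNotes = [36, 44, 38, 42, 41, 39, 51, 49]   // ACPad scene triggers, assuming C2 = 36
      FillArray sceneChans, 0, 128                     // 8 scenes x 16 channel flags
      sceneChans[0] = 1                                // scene 0 (C2): channels 1, 5 and 7
      sceneChans[4] = 1
      sceneChans[6] = 1
      activeScene = 0
      FOOT_CH = 9                                      // channel 10, zero-based
    @End

    @OnMidiNoteOn
      if MIDIChannel = FOOT_CH
        // echo foot notes to every channel in the active scene's set
        for ch = 0 to 15
          if sceneChans[activeScene * 16 + ch] = 1
            SendMIDINoteOn ch, MIDINote, MIDIVelocity
          endif
        endfor
      else
        // otherwise treat the note as a scene switch
        for i = 0 to 7
          if MIDINote = sceneNotes[i]
            activeScene = i
          endif
        endfor
      endif
    @End
    `

    A complete script would also remember which channel set each note-on used, so the matching note-off reaches the same channels even after a scene switch.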

  • edited June 2020

    @_ki said:
    Shouldn't it be the foot-controller notes that are routed through the script, with the scene switch notes (also routed to the script) just changing the 'script-internal routing' to the output channels for those foot-controller notes?

    For this it would be best if the foot-controller sends its notes on a different channel than the ACPad, to make both types of notes easier to distinguish.

    The foot controllers send out notes F3, G3, A3, and B3.

    For my current Audiobus setup, I run the foot controllers into MidiFire, where I can do the sorts of MIDI channel duplication and filtering that you describe. So, really, no problem to assign different channels to the foot controllers and the ACPad.

    For AUM, I had ruled out toggling input channels for the exact reason that you mention. I had settled on using MIDI channel filters, so that each AL instance would listen on a single channel and the logic of each "scene's" MIDI channel configuration could be handled elsewhere (preferably Mozaic :smile: ).

    By the way, I had thought about using MidiFire to do the matrix for each scene's MIDI channels, but it doesn't appear that I can configure MidiFire to do this sort of thing without it becoming a big plate of spaghetti with the connection routing in the GUI.

    Edit: forgot to mention that the foot controllers send on channel 10.

    Another observation: @wim's script MIDI Channel Mute v1 is perfect for my purposes (I tested it in AUM with the foot controllers), and by turning on the channels by scene, it delivers the intended result. My ideal would be essentially to configure eight of these (one for each of the ACPad notes) in one script, by "selecting" each of the scenes first (corresponding to an ACPad pad) and then tapping shift to configure as shown in Wim's script. @_ki, your first screen of Midi Matrix Switch would work nicely for this, except that tapping Shift would take me into Wim's channel mute configuration.

  • edited June 2020

    @_ki

    ** I AM AN IDIOT! **

    I was just reading over your MIDI Matrix Switch v10a documentation again and read that I can set each channel per "scene" to be muted. I missed that the first time but that's the key to making this work. And it works perfectly for what I'm trying to accomplish! Thanks for this excellent script.

  • _ki
    edited June 2020

    @lukesleepwalker
    Oh - I'm nearly finished with a new script to multicast MIDI from one to many channels... I had lots of fun figuring out a cool and clean UI that can display the selected 1 to 16 channels on each scene's pad, and working out how to set up and edit the configurations.

    I'm going to finish and publish it over the weekend - maybe there are other use-cases :)

    .

    In 'Midi Matrix Switch' you can only do many-to-one (merging input channels) or one-to-one connections - so you would need to 'duplicate' the footswitch MIDI to all channels used by the AudioLayer instances and then use different MMS configurations which mute the unwanted channels.

  • edited June 2020

    @_ki said:
    Oh - I'm nearly finished with a new script to multicast MIDI from one to many channels... I'm going to finish and publish it over the weekend - maybe there are other use-cases :)

    In 'Midi Matrix Switch' you can only do many-to-one (merging input channels) or one-to-one connections - so you would need to 'duplicate' the footswitch MIDI to all channels used by the AudioLayer instances and then use different MMS configurations which mute the unwanted channels.

    Oh wow, that is mighty nice of you to work on the new script! I was not expecting that. I will be happy to take a look and see if it does the job even better than the matrix switch script. I do think there are many use cases for this type of script - so many people here ask about clip launchers for AUM...

    Speaking of the matrix switch script: you are correct about the duplication of the footswitch MIDI to all channels. I wrote a StreamByter script to do that and it works very well. Just an extra MIDI FX plugin to manage, but not a big deal.

  • _ki
    edited June 2020

    @lukesleepwalker The script is published: MIDI MultiCast on patchstorage.

    • Easy setup of scenes, including auditioning of the current setup, MIDI learn of the trigger note, and double-tap to set up the next scene
    • Neat display of the configured output channels for each of the scenes
    • Extensive manual shown on the help page
    • Two banks of 16 scenes using different color schemes. Bank switch via knob, PC message or AU user0 parameter
    • Each input note sends its note-off to the channel combination used for the note-on. This allows sustained input notes even during scene/bank switches
    • All MIDI events on the input channel are multicast
    • Many knobs support a double-tap action

    Development and testing time was ≈20 hours - I had lots of fun developing the output channel display and polishing the user experience during setup and usage

  • Damn @_ki that is some cute interface design on the pad channel displays! I can't wait to see the cleverness in the script.

  • _ki
    edited June 2020

    @wim It supports displaying up to 16 channels, in a different layout for each count. The display uses a precompiled list for each of the pads, which is then picked up by one of the 16 layout LabelPad functions.

    I update these lists whenever a scene configuration changes (but only for the changed pad).

    The 16 channels of each scene are stored as a bit-set, so it's only one value per scene. Iterating this bit-set (with an additional test) proved to be faster than always iterating 16 times per pad. (I did about 15 versions to find the best solution and compared the CPU cost when run multiple times in a timer loop.)

    I also used this script as another example for my Migration Manager include - allowing settings to be migrated between script versions (for instance when updating or developing). I also published that just today - but I have been using it for about half a year in several of my unpublished scripts.
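
    To make the bit-set idea concrete: since Mozaic variables are plain numbers, a scene's 16 channel flags can be packed into a single value and unpacked with ordinary arithmetic. A hypothetical sketch (not the actual MIDI MultiCast code):

    `
    @OnLoad
      // powers of two let us pack and unpack flags without bitwise operators
      pow2 = [1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048, 4096, 8192, 16384, 32768]
      mask = pow2[0] + pow2[4] + pow2[6]   // channels 1, 5 and 7 packed into one value (81)
      rest = mask
      for i = 0 to 15
        ch = 15 - i                        // walk the bits from high to low
        if rest >= pow2[ch]
          rest = rest - pow2[ch]
          chnum = ch + 1
          Log {channel in set: }, chnum
        endif
      endfor
    @End
    `

    Packing each scene into one number also keeps state saving cheap: a whole bank of 16 scenes fits in a 16-element array.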

  • @_ki said:
    @lukesleepwalker The script is published: MIDI MultiCast on patchstorage.

    I am speechless; this is truly a marvel. The details are incredible, from the channels noted on each pad (makes an easy reference) to the import knob... Wow. Going to put it through its paces now, but first impressions? Like I said, speechless.

  • I hope that it fits your needs and may be useful for other use-cases.

    .

    There are still details we didn't fully discuss yet that might have led to a totally different solution:

    • When playing the footswitches, do you just tap them, or hold them to sustain them and play your loops?
    • I looked up the FreeDrum sensors, but couldn't find out which notes are issued by the two foot sensors of the set - they seem to depend on the 'tapping direction' of the feet. In your playing, do you use this feature or only issue one note per foot? You mentioned four notes in one of your postings.
    • I assumed the footswitches send note-ons and note-offs and that the samples play until the footswitch is released.
    • Does each AudioLayer instance contain only a single sample, or multiple samples (maybe one for each of the four footswitch notes)?
  • edited June 2020

    @_ki said:
    There are still details we didn't fully discuss yet that might have led to a totally different solution:

    There are definitely things I haven't explained yet, mostly because I recognize my use case is idiosyncratic and "edgy". I think the script that you provided could be used for many different use cases that are asked about all the time here on the AB forums. People are always saying "why are there no AUv3 MIDI clip launchers?" This script you have created does exactly that!

    I do use the sensors for different notes. This short movie shows how I use them to play both longer phrases (the drum part in this movie) and short notes. This is a sub-par performance, as the timing is off in several places, but you'll get the general idea.

    In this case the "short notes" are the atmospheric keyboard parts, but in many songs I play bass lines by tapping on the same sensor in round-robin mode. (I use @wim's excellent Rounder Robin script to play the short notes.) I have set up "zones" in AudioLayer for each of the four notes that I play, and then I simply line up the samples across each "zone" in the order I want them played via the Rounder Robin script. I like it much better than my old looping setup because it feels "loose" and musical - I can pause for emphasis, and I'll often randomize layers in AudioLayer to introduce even more serendipity into the arrangements.

    Last point: I am removing note-offs with a separate script because the FreeDrum sensors are designed for percussion, so they have very short gates for each note. Scripting out the note-offs solves this problem quite easily. I think your approach of preserving note-offs while changing scenes is quite elegant, and obviously most use cases are going to require both note-ons and note-offs.

    I am planning full song videos to share in the near future. Your script contributions are much appreciated in making these projects happen!

  • _ki
    edited June 2020

    @lukesleepwalker
    If you remove the note-offs before they reach my script, then the 'double note protection' feature will ignore the new note-ons (until a note-off is received for that note).

    If needed, this note-off removal should take place after the MIDI MultiCast script.

    .

    I just had a look at the video - cool application to get interactive accompaniment for your guitar playing 👍🏻

  • @_ki said:
    @lukesleepwalker
    If you remove the note-offs before they reach my script, then the 'double note protection' feature will ignore the new note-ons (until a note-off is received for that note).

    If needed, this note-off removal should take place after the MIDI MultiCast script.

    .

    I just had a look at the video - cool application to get interactive accompaniment for your guitar playing 👍🏻

    Yes, I remove the note-offs as the last link in the MIDI chain:

    The Audeonic mfx converts the foot sensor notes to the correct "zones" in AudioLayer. The first Mozaic script is Rounder Robin. The first StreamByter script copies the CH10 data to all 16 channels. The second Mozaic script is the most excellent MIDI MultiCast. And finally, the second StreamByter script strips out the note-offs.
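
    For readers without StreamByter, each of those two utility stages could also be expressed as a small Mozaic script. Two rough sketches (assuming, as above, that the foot notes arrive on channel 10, i.e. channel 9 zero-based; in the real chain MIDI MultiCast sits between these stages):

    `
    // Sketch 1: copy channel-10 note-ons to all 16 channels;
    // a fuller version would mirror this in @OnMidiNoteOff.
    @OnMidiNoteOn
      if MIDIChannel = 9
        for ch = 0 to 15
          SendMIDINoteOn ch, MIDINote, MIDIVelocity
        endfor
      else
        SendMIDIThru
      endif
    @End
    `

    `
    // Sketch 2: strip note-offs, pass everything else through.
    @OnMidiNoteOff
      // intentionally empty - note-offs are swallowed
    @End

    @OnMidiInput
      SendMIDIThru
    @End
    `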

  • @lukesleepwalker said:

    The Audeonic mfx converts the foot sensor notes to the correct "zones" in AudioLayer. The first Mozaic script is Rounder Robin. The first StreamByter script copies the CH10 data to all 16 channels. The second Mozaic script is the most excellent MIDI MultiCast. And finally, the second StreamByter script strips out the note-offs.

    And I thought my setup was complex! Enjoyed the video and look forward to more. Very interesting.

  • @hypnopad said:

    @lukesleepwalker said:

    The Audeonic mfx converts the foot sensor notes to the correct "zones" in AudioLayer. The first Mozaic script is Rounder Robin. The first StreamByter script copies the CH10 data to all 16 channels. The second Mozaic script is the most excellent MIDI MultiCast. And finally, the second StreamByter script strips out the note-offs.

    And I thought my setup was complex! Enjoyed the video and look forward to more. Very interesting.

    What's funny is that I think it's actually awesomely simplified... My early experiments were these sprawling MidiFire canvases with modules snaking all over the place. I probably could simplify the current MIDI chain shown by combining some of the scripts into a single script, but I appreciate keeping things "modular" in case I need to tweak.
