Audiobus: Use your music apps together.


Elastic Drums: MIDI, same controller etc

I'm trying to MIDI map a scene page of my NanoKONTROL Studio to ElasticDrums.

To make life easier I've changed the NanoKONTROL CCs to match the CCs listed in the manual for the knobs and buttons I want to use. But I'm puzzled that this makes no difference: they still need to be MIDI-learnt.
OK, fine, that's only a bit of extra work, but then I'm baffled by "Same controller for all instruments", where the MIDI mapping for instrument select is hidden, so how does one MIDI-learn it? Am I right in thinking that the strip of buttons on the right of the Instr screen are the select buttons, and that they are not the same as the Trigger buttons?

Finally, it seems especially tricky to map the XY controls to an XY controller! This isn't just an issue for Elastic Drums, but I wondered if anyone has a trick for this that isn't mapping to some other controls first and then matching their values on the XY controller.

Also, I don't know if this is by design, but it seems I can only talk to Elastic Drums with the NanoKONTROL Studio when it's connected through AUM (I haven't tried AB3 or Modstep, but I imagine they would work similarly) and not standalone, this despite ED recognising the NanoKONTROL Studio in the MIDI options.

Excellent app, no real complaints, so please don't think I'm criticising. I'm just confused about what is going on with some of these things. I will get more out of Elastic Drums once I have it properly integrated into my Nano Studio workflow.

Comments

  • I've been meaning to map a controller in the studio to ED, so I'll let you know what I find out when I do. In the meantime, I'd suggest contacting the dev, Oliver, directly. He's pretty accessible.

  • Directly here, or by emailing the address on their website, "ed AT mominstruments DOT com"?

  • @Calverhall said:
    I'm trying to MIDI map a scene page of my NanoKONTROL Studio to ElasticDrums.

    I'm using a Korg NanoKONTROL (not Studio) and it works fine directly in ED. Are you trying to use Bluetooth? That could be the issue.

    I was never able to get the hidden instr0–instr5 triggers to work. As you noted, despite what the manual says, a lot of the parameters are not actually mapped, and since those are hidden there's no way to assign them.

    The trig0–trig5 triggers are basically the same. They will work for selecting the different instruments when "Same controller for all instr." is on, so you can tweak all parameters of all instruments with just 16 knobs.

    One thing to note: if the "trigger instrument" XY pad on the Jam page is on something other than 0, you won't hear anything when you press the MIDI trigger, unless the pattern is playing - then it will trigger as per the repeats listed.

    For XY controllers, try running your finger along the edges of the pad to restrict which controller is sent; otherwise, remap or use something else to make the assignment.

  • Cheers for your response @aplourde. It encapsulates why I posted here rather than directly although I have done that too.

    While this related thread is here: what are people's reactions to ElasticFX?

  • @Calverhall said:
    Cheers for your response @aplourde. It encapsulates why I posted here rather than directly although I have done that too.

    While this related thread is here: what are people's reactions to ElasticFX?

    I had the same problem mapping X/Y. Someone here suggested first pressing on the edge and then tapping the control you need to assign. It has worked for me on the nanoPAD2.
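The edge trick makes sense if you assume ED's MIDI learn simply latches onto the first CC number it receives. A rough sketch of that assumption (plain Python, no real MIDI I/O; the CC numbers are invented for illustration):

```python
# Hypothetical model of a "MIDI learn" that latches the first CC it sees.
# An XY pad normally sends two CCs at once (X and Y), so learn can grab
# the wrong one; moving along one edge makes only a single CC change.

def midi_learn(cc_stream):
    """Return the first CC number seen - a common MIDI-learn strategy."""
    for cc_number, value in cc_stream:
        return cc_number
    return None

# Free movement on an XY pad: X (CC 12) and Y (CC 13) interleave, so
# which one gets learnt depends on which message happens to arrive first.
free_drag = [(13, 64), (12, 65), (13, 66)]

# Dragging along the bottom edge: Y stays put, only X (CC 12) is sent.
edge_drag = [(12, 10), (12, 11), (12, 12)]

print(midi_learn(free_drag))  # may latch Y instead of X
print(midi_learn(edge_drag))  # reliably latches X (CC 12)
```

So pressing along an edge first guarantees the learn function only ever sees the axis you want to assign.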

  • @Calverhall said:
    I'm trying to MIDI map a scene page of my NanoKONTROL Studio to ElasticDrums.

    To make life easier I've changed the NanoKONTROL CCs to match the CCs listed in the manual for the knobs and buttons I want to use. But I'm puzzled that this makes no difference: they still need to be MIDI-learnt.
    OK, fine, that's only a bit of extra work, but then I'm baffled by "Same controller for all instruments", where the MIDI mapping for instrument select is hidden, so how does one MIDI-learn it? Am I right in thinking that the strip of buttons on the right of the Instr screen are the select buttons, and that they are not the same as the Trigger buttons?

    I am not 100% sure if I understand your problem, but it might indeed be related to the "Same controller for all instr." button. Have you tried to set up your device with this button deactivated? Does it help?

    The idea of this button is: when it's switched on, you only have to MIDI-learn one channel, and you control the parameters of whichever channel is currently chosen. For example, all the kick parameters of the first channel. When you then switch to the second channel (it might be your snare or whatever), you control all of its parameters with the same knobs you used for channel one. I hope I explained that understandably, sort of ...
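If it helps to picture it, the behaviour Oliver describes could be modelled roughly like this (my own sketch, not ED's actual code; the CC numbers and parameter names are invented):

```python
# Rough model of "Same controller for all instr.": one bank of learnt CCs
# controls whichever channel is currently selected. CC numbers and
# parameter names here are invented for illustration.

CC_TO_PARAM = {20: "pitch", 21: "decay", 22: "volume"}  # learnt once

# Six drum channels, each with its own parameter values.
state = {ch: {"pitch": 64, "decay": 64, "volume": 100} for ch in range(6)}
selected = 0  # currently chosen channel (e.g. the kick)

def handle_cc(cc, value):
    """Route a learnt CC to the currently selected channel's parameter."""
    param = CC_TO_PARAM.get(cc)
    if param is not None:
        state[selected][param] = value

handle_cc(20, 30)   # tweaks channel 0's pitch
selected = 1        # switch to channel 1 (e.g. the snare)
handle_cc(20, 90)   # the SAME knob now tweaks channel 1's pitch

print(state[0]["pitch"], state[1]["pitch"])  # 30 90
```

The point is that the knob-to-parameter table is learnt only once; switching the selected channel reroutes the whole bank.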

  • edited May 2018

    @elasticdrums said:

    @Calverhall said:
    I'm trying to MIDI map a scene page of my NanoKONTROL Studio to ElasticDrums.

    To make life easier I've changed the NanoKONTROL CCs to match the CCs listed in the manual for the knobs and buttons I want to use. But I'm puzzled that this makes no difference: they still need to be MIDI-learnt.
    OK, fine, that's only a bit of extra work, but then I'm baffled by "Same controller for all instruments", where the MIDI mapping for instrument select is hidden, so how does one MIDI-learn it? Am I right in thinking that the strip of buttons on the right of the Instr screen are the select buttons, and that they are not the same as the Trigger buttons?

    I am not 100% sure if I understand your problem, but it might indeed be related to the "Same controller for all instr." button. Have you tried to set up your device with this button deactivated? Does it help?

    The idea of this button is: when it's switched on, you only have to MIDI-learn one channel, and you control the parameters of whichever channel is currently chosen. For example, all the kick parameters of the first channel. When you then switch to the second channel (it might be your snare or whatever), you control all of its parameters with the same knobs you used for channel one. I hope I explained that understandably, sort of ...

    Hi ya, thanks for this reply. This was like 4 months back, and I have been using the excellent Audiobus Remote controls lately. I'll reply properly later, after work this evening, because I'm not sure you have quite answered my question ...

    Until then, ED is great.

  • Hi,
    I'm here to continue the original questions/issues and can be more specific; I understand how everything works, but there are still a few problems. Note that there are some redundancies in my descriptions, just to try to be clearer. I'm not sure if that will actually help.

    What works: I've mapped all my MIDI controls to all mappable parameters (I DON'T have "Same controller for all instr." turned on), and that's all good, but:

    1. I can't actually select a channel with MIDI the way you can with the touch screen. I'd like to see the instrument page when switching to an instrument on my controller (currently it just triggers, but I'd prefer to select, as the manual implies with instr0-5).

    (Let's make the distinction between triggering and selecting: currently I can trigger via MIDI note, and simultaneously on the Jam page it selects the instrument's XY pad, but the instrument page does not switch when you trigger it.)

    2. I'd prefer to trigger the instruments from MIDI clips via Live, which I can do, but this interrupts the XY controls, because the notes also select the XY trigs, which interrupts the other trigs (swng/para/stop/freez/stutt/del). This means I can't change the X/Y for non-trig XY effects, because once a note hits, it switches the XY to that instrument's trig.

    This leads me to conclude that MIDI notes triggering an instrument and instrument selection should be separate, and that instrument selection should also switch the visual instrument controls. MIDI notes shouldn't select the XY trigs, but instrument select should; that way, notes won't constantly switch between XY trigs. Incoming notes would solely trigger the instrument, with no visual changes and no XY trigs selected. Only selecting the instrument would do those two things.

    Also, I don't see the instr0-5 (hidden) controls anywhere, though they're stated in the manual. It makes me think they would switch to the instrument controls, but those mappings aren't available.

    Also, an XY control for each Jam switch would be more ideal, so the XY values for each switch stay in place.
    I know that would mean a total of 36 CCs instead of 2, but it'd be nice to have the XY controls stay in place when you switch to a different trig.
    I know this also means more CCs than one channel can provide, but if the settings are in omni mode, ED could take CCs from multiple channels truly independently: I could map 96 of the instrument CCs on channel 9 while the XY pad CCs come from channel 10, with no conflict. Currently there are conflicts between different MIDI channels.

    I have enough controls, but there's a huge MIDI bottleneck in ED currently. I know it'd take a lot of work to overhaul, but it's not at its ideal potential. It still kicks ass, which I can't thank you enough for, but the limits are a bit of a PITA. Nonetheless, at this point: thanks :)
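The multi-channel idea in that last point boils down to keying mappings by (channel, CC) pairs rather than by CC number alone. A minimal sketch of the difference, assuming hypothetical CC numbers and target names:

```python
# Sketch of channel-aware CC mapping: the same CC number on different
# MIDI channels resolves to different targets, so an instrument bank on
# channel 9 and an XY pad on channel 10 never collide. All numbers and
# target names here are hypothetical.

mapping = {
    (9, 1): "instr0.pitch",   # CC 1 on channel 9
    (10, 1): "xy_pad.x",      # CC 1 on channel 10 - no conflict
}

values = {}  # last received value per mapped target

def handle_cc(channel, cc, value):
    """Dispatch by the (channel, CC) pair, not the CC number alone."""
    target = mapping.get((channel, cc))
    if target is not None:
        values[target] = value

handle_cc(9, 1, 40)
handle_cc(10, 1, 80)
print(values)  # {'instr0.pitch': 40, 'xy_pad.x': 80}
```

With a plain CC-keyed table, the second message would have overwritten the first target; keying by the pair keeps the channels truly independent.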

  • edited June 10

    Also, I'm a huge fan of bidirectional feedback, so that support is always recommended. I don't know how expensive it is to implement, but it simplifies things for the user. Take the mutes, for example, especially since they have a momentary ability depending on the timing: if I program the LEDs for the mutes on my controller, there's no synchronisation between the app and the controller to show whether the mutes are actually on or off. For one, I have to program the same momentary behaviour and fine-tune the timing for that to work, but even then the controller doesn't reflect the actual mute state. I.e., if I turn a mute on in the app, the LEDs on my controller aren't going to change, so the controller shows the wrong state.

    I just want to add, despite these high demands, that the fact each pattern holds all the information, including which instruments are loaded, is f'ing brilliant and powerful. Even between 4 patterns, that's 24 different sounds. It's really nice to be able to keep the patterns empty and trigger via clips, but switch between sounds via controller (or MIDI clip); I love it.
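For what it's worth, the bidirectional side usually amounts to the app echoing every state change back out as MIDI, whether the change came from the controller or from the touch screen. A toy sketch (the mute CC numbers are invented):

```python
# Toy model of bidirectional feedback: whenever a mute toggles - whether
# from the controller or from the touch screen - the app sends the new
# state back out, so the controller's LEDs always match. The CC numbers
# are invented for illustration.

sent_to_controller = []  # stands in for the MIDI output port

MUTE_CC = {0: 40, 1: 41}  # channel index -> mute CC (hypothetical)
mutes = {0: False, 1: False}

def set_mute(channel, on):
    """Change a mute and echo the new state to the controller."""
    mutes[channel] = on
    # 127 = LED on, 0 = LED off; the controller just mirrors this CC
    sent_to_controller.append((MUTE_CC[channel], 127 if on else 0))

set_mute(0, True)    # toggled on screen: the LED lights up anyway
set_mute(0, False)   # toggled again: the LED goes dark
print(sent_to_controller)  # [(40, 127), (40, 0)]
```

Because the echo happens in the setter itself, it doesn't matter where the toggle originated; the controller can never drift out of sync.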

  • Just wanted to bump this, but I would also suggest that you tag @elasticdrums. I guess I just did, so maybe wait a couple of days to give him a chance to respond, but in general it is good practice to tag the devs when asking questions. Some of them seem to know when their app is being mentioned (whether that's some kind of automated search they have programmed, or they just search every few days, I don't know), but that is going above and beyond IMHO, and I wouldn't expect it of them.

    Apologies if this was just an oversight on your part; it's just not uncommon for people to post app questions without tagging the dev, as though shouting into a void and hoping someone hears them.

    Cheers
    Thaddeus

    @shazbot said:
    Also, I'm a huge fan of bidirectional feedback, so that support is always recommended. I don't know how expensive it is to implement, but it simplifies things for the user. Take the mutes, for example, especially since they have a momentary ability depending on the timing: if I program the LEDs for the mutes on my controller, there's no synchronisation between the app and the controller to show whether the mutes are actually on or off. For one, I have to program the same momentary behaviour and fine-tune the timing for that to work, but even then the controller doesn't reflect the actual mute state. I.e., if I turn a mute on in the app, the LEDs on my controller aren't going to change, so the controller shows the wrong state.

    I just want to add, despite these high demands, that the fact each pattern holds all the information, including which instruments are loaded, is f'ing brilliant and powerful. Even between 4 patterns, that's 24 different sounds. It's really nice to be able to keep the patterns empty and trigger via clips, but switch between sounds via controller (or MIDI clip); I love it.

  • edited July 10

    Even if the app supports MIDI feedback, you still have to establish that connection somewhere in the app or the host, to specifically send MIDI back to the controller so the lights update and whatnot.

    I use Elastic Drums with a controller, and I find that the "same controller" option makes everything way easier if you set up your controller accordingly. I've uploaded a template and diagrams here on the Audiobus forum; it's actually a mapping for using Elastic Drums and Elastic FX together on the same controller.
