
how to route MPE MIDI from the iPad to PC

Hi there,

I am looking for an app (or another way) that supports “full MPE MIDI” routing for several instruments (apps with MIDI out, or MIDI generators of their own) from the iPad to the PC – which means I need several MIDI ports, and channel pressure / aftertouch messages also have to be passed over.

Concretely, I want to use the (awesome) Touchscaper app and its MIDI out feature to play my also-MPE-ready synthesizer VSTs (Arturia Pigments 2, Continua, Serum, etc.) in my DAW on the PC, with Touchscaper’s “chord and touch & glide logic”, in an incredibly easy yet immersively expressive way … if anyone enjoys (or wishes 😉) to play the Roli Seaboard but is not that much of a piano genius (finger acrobat 😉), the Touchscaper MIDI out feature (main instruments A + B and ribbon = 3 MPE instruments) in conjunction with your MPE-ready synthesizer will blow your mind & musical heart.

The following options did not work

• StudioMux and its ability to route MIDI over USB (with the iPad connected directly to a USB port of the PC): for some reason it drops (filters out) the channel aftertouch messages, and hence I lose the expressive playing inside Touchscaper (moving the finger around after “touch down” 😉) – contacting their support via email and leaving a message in their touchAble Pro support got no response (the last StudioMux update is nearly two and a half years old – probably it’s out of production & support).

• musicIO MIDI over USB supports only 1 MIDI port on the PC side.

• Midimittr supports only 1 MIDI port.

• using the IDAM (Inter-Device Audio Mode + MIDI) feature of iOS <-> macOS, with the Lightning-to-USB cable plugged into a Mac mini, only creates 1 MIDI in port (also 1 MIDI out port, which does not help), and I cannot create new ones in the Audio MIDI Setup tool on macOS.

• using multiple MIDI network sessions (rtpMIDI sessions) does not work either, as iOS limits this to 1 network session – you can connect more than one, e.g. in Audiobus 3 MIDI lanes, but they are merged together under the hood by iOS; see the answer from “Audiobus Michael” here: https://forum.audiob.us/discussion/39657/routing-midi-data-onto-2-or-more-midi-network-sessions-fails

As a workaround I use Photon AU inside AUM to record Touchscaper’s MIDI out, per instrument and in an MPE-compatible way, into MIDI files and import these MIDI files into my DAW afterwards – but this is not really satisfying, as I cannot hear in real time how my expressive playing (moving my fingers around on the display after “touch down”) affects the sound of the synthesizer patch in the DAW on the PC.

MIDI hardware like the iConnectivity mio2 (which gets you 10 MIDI ports with 16 channels each, https://www.iconnectivity.com/products/midi/mio2, at about €70 street price) could do the job, but I want to avoid another piece of hardware and more cables … I hope there is a way to get the MPE MIDI data over to the PC via the “Lightning-to-USB cable” or via WiFi (I did not find a MIDI-to-OpenSoundControl bridge either … MidiFire has an OSC module, but only for communication with other MidiFire clients/apps) … any ideas are very much appreciated.

thanks & cheers
sounda


Comments

  • Plan A: Bluetooth?
    Plan B: converting aftertouch to CC with a MIDI utility and then CC back to aftertouch for StudioMux (strange that it doesn’t support aftertouch).

    Why the need for multiple ports for a single MPE controller?

  • @Korakios said:
    Plan A: Bluetooth?
    Plan B: converting aftertouch to CC with a MIDI utility and then CC back to aftertouch for StudioMux (strange that it doesn’t support aftertouch).

    Why the need for multiple ports for a single MPE controller?

    It sounds like they want to use 3 separate Touchscaper MPE controller instruments to drive 3 MPE VSTs.

  • I dunno if Bluetooth can handle the bandwidth reliably enough. MPE over BT hasn’t been successful with my Sensel Morph; only USB works reliably. Devices like the NanoKey Studio disable MIDI clock over BT, and that’s arguably less bandwidth than MPE.

    Maybe it’s just those two devices that can’t handle a lot of BT bandwidth. I guess a good test would be to blast some MPE over BT between two iOS devices and see how that goes.

  • @Korakios + @CracklePot
    I need multiple MIDI ports because 1 Touchscaper instance and its touch instrument – the one you play by touching the screen (the concentric circles in the middle, the strip on the right side and the ribbon strip at the bottom) – is internally made of, and also externally (when you use the MIDI out feature) reflected as, 3 instruments: main instruments A + B (for the concentric circles and the sidebar on the right) and the ribbon instrument (the strip at the bottom).

    So with 1 Touchscaper instance you get 3 MPE-ready instruments (as of now 😉) – you don’t have to use MPE, by the way, but MPE support is what makes Touchscaper’s playing style (touch gestures) so useful when you want to play synthesizer patches (which also have to support MPE) in an expressive way.

    It might sound weird to play 3 instruments (synthesizer VSTs in a DAW) with 1 MPE MIDI controller, but it is a matter of layering sounds (the respective synthesizer VSTs) to create an interesting & wide sound built from different timbres (1 instrument = 1 layer = 1 timbre).

    And with MPE and something like Touchscaper you get “finger / touch control” over the sound of each note in a chord you play, which means the possibilities for creating evolving sounds are incredible, yet so easy to realize, as it happens immediately when you move your fingers together or individually (that’s where MPE’s per-note feature shines) around the surface.

    This is such a difference from the “dry compose flow” of editing this “expressive data” in the MIDI editor of the DAW … using an MPE-enabled touch device you feel the sound; it’s a wonderful & joyful bodily, emotional & mental feedback of what’s happening … I personally get totally different ideas of how to alter the synthesizer patch and let the sound evolve through touch movements.

  • @sounda, so follow Plan B

  • edited June 2020

    @sounda

    I was reading a bit about the network MIDI single-port limitation.
    If I understood it correctly, the trick is to create multiple sessions on the PC. Each session limits you to 16 channels, but if there is more than one session, session participants can connect to any of them. I believe you would do this in rtpMIDI on the PC.

    It seems that when you create multiple sessions on the iPad, iOS will merge them all into one session if there is only one session to join on the PC. Each session on the PC will be a 16-channel port.

    I can’t test this out right now. My gear is lacking.
    But it seems a simple thing for you to try, if you haven’t already.

    Edit- just saw your other thread. No luck, huh?

  • I also checked how well the Roli stuff works with Bluetooth MIDI.
    A support thread dated March 2020 claims that Windows does not have adequate Bluetooth support to use the Roli gear with a Windows PC over Bluetooth MIDI.

    But it seems to work if you have a Mac.

  • The Seaboard works well via Bluetooth on iPads and Windows 10 (not the LTSC version) through an app called MidiBerry.
    https://www.microsoft.com/en-us/p/midiberry/9n39720h2m05?activetab=pivot:overviewtab
    People claim that MidiBerry is unstable and disconnects, but the trick is to leave it in the foreground and never minimize it. Maybe that has to do with a Win10 setting that disables background running for MS Store apps.

  • edited June 2020

    I don't have Touchscaper, but does StudioMux create 3 different MIDI ports?

  • @Korakios + @wim
    just gave Bluetooth MIDI a try with my Mac mini as the host (to be sure not to miss an option that is only available on Planet Apple 😉) … but in Audio MIDI Setup I can only set up 1 MIDI port, so it’s the same as using the IDAM feature on macOS (plugging the Lightning cable into a USB port on the Mac).

  • @Korakios

    Plan B – routing channel pressure via a MIDI CC to get it through StudioMux is an interesting idea!!

    I could think of MidiFire and its StreamByter module, although it looks like programming in assembler.

    Midiflow (https://www.midiflow.com/) advertises “transform MIDI in various ways - remap all kinds of MIDI messages to a different type or modify their values exactly as you need them” … sounds promising.

    Also remember Bram Bos’s Mozaic (http://ruismaker.com/mozaic/) – it looks very pleasing, but I don’t know if the script language supports something like this … it is AUv3, though, and hence could be placed very nicely inside AUM, which I use for hosting Touchscaper – so the setup would not fragment further 😉.

    Another challenge would be the PC side – mainly doing the backwards transformation efficiently, without the tool used for it introducing latency … if I use StudioMux for the “USB pipe”, I would have to introduce another set of virtual MIDI ports in between (StudioMux ports -> transformation ports -> DAW) where the remapping from CC back to channel pressure would be implemented.

    So, yes, you can create several virtual MIDI ports inside StudioMux on the iPad and they are automatically created & reflected on the PC side – I also checked that StudioMux routes pitch bend and MIDI CCs.

    I will wrap my head around this and come back to this later 😉.

    In the meantime, by the way, I found a way to create a MIDI pipe with Lemur – no interface, just running in the background and piping the data through – I got this working with 2 MIDI ports, but I also experienced 2 crashes of Lemur already (maybe its run-in-the-background feature is not that stable).

    This would be just another workaround, a helper, not a real solution … I’ll come back to this later, once I’ve got it completed and jammed around with it for a while.

    Is anyone interested in the Lemur project? If so, after some testing, I could put together a description with screenshots of how to set this up – it’s not the most intuitive thing, especially the Lemur Daemon port configuration – and make it available for download.

  • @CracklePot - regarding the MIDI Network Session

    yes, no luck, and I did it the way you described – using rtpMIDI on the PC; see the details here: https://forum.audiob.us/discussion/39657/routing-midi-data-onto-2-or-more-midi-network-sessions-fails.

    And I trust “Audiobus Michael” on what he is saying – I think he will have had some hours of debugging this … and by the way, Audiobus 3 is the only app I have seen so far that picks up the Bonjour names of all rtpMIDI sessions the iPad is connected to, but it’s iOS that rules underneath and does its thing, which a user would not expect 😉.

  • edited June 2020

    I think the StreamByter code would be something like
    DX = BX 09
    which maps aftertouch (channel pressure) on any channel to CC 9 on the same channel.

    Edit:
    For the PC you only need a free VST MIDI remapper before your VST instrument. Maybe the pizmidi plugins still work.
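
    To make the round trip concrete, here is a rough sketch in plain JavaScript of what the two conversions do at the byte level (illustration only – this is not StreamByter syntax, and CC 9 is just the stand-in we picked):

    var CC_NUMBER = 9; // the stand-in CC that carries the pressure value

    // iPad side: channel pressure [0xD0|ch, pressure] -> CC [0xB0|ch, CC_NUMBER, pressure]
    function pressureToCC(msg) {
        if ((msg[0] & 0xF0) === 0xD0) {
            return [0xB0 | (msg[0] & 0x0F), CC_NUMBER, msg[1]];
        }
        return msg; // everything else passes through untouched
    }

    // PC side: CC [0xB0|ch, CC_NUMBER, value] -> channel pressure [0xD0|ch, value]
    function ccToPressure(msg) {
        if ((msg[0] & 0xF0) === 0xB0 && msg[1] === CC_NUMBER) {
            return [0xD0 | (msg[0] & 0x0F), msg[2]];
        }
        return msg;
    }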

  • @Korakios
    regarding the PC side it’s a little bit trickier, because of MPE and the DAW I use for this, which is Bitwig Studio – it has a concept of (MIDI) “note event expressions”, which it makes available in a separate area underneath the MIDI editor where you can edit the MPE-specific data (like channel pressure) on a per-note basis: select a note in the MIDI clip and you see only the MPE data belonging to that specific note highlighted, and you can manipulate it as you like without losing context, still seeing the values of the other notes in the chord, or in the “neighborhood”.

    But this has a drawback: I cannot record the “temporary MIDI CCs” into Bitwig, as they would not show up & act as “note event expressions” … and I want the possibility to edit my Touchscaper / MPE controller performance afterwards.

    There is a Controller API for Bitwig which lets one develop custom MIDI controller templates; these are used e.g. for routing MIDI keyboards or virtual ports inside, and data from Bitwig to the outside world. This guy (http://www.mossgrabers.de/Software/Bitwig/Bitwig.html) has done a lot of templates, and on his YouTube channel he offers insights into “Developing with the Bitwig Controller API”.

    With these custom (MIDI) controller templates you can route various incoming data inside the DAW and control the DAW in various respects (transport, new tracks, etc.) … so this is another, but for now more theoretical, option for the backwards transformation (CC to channel pressure / aftertouch) – the time investment would only make sense if it opened up further musical or expressive features that one cannot achieve with existing tools.

    +++

    Nevertheless, your suggestion of using a “VST MIDI plugin” to transform the MIDI CC back to channel aftertouch in front of the VST in the DAW’s device chain would work great for live performances and during sound design, while tweaking the synthesizer patch.

    I found a MIDI tool for routing & mapping on Windows named MIDI-OX (http://www.midiox.com/) – it was last updated a long time ago (17.06.2010), but I found posts where people stated they got it running under Windows 10 64-bit … I will give it a try (using loopMIDI, http://www.tobias-erichsen.de/software/loopmidi.html, to create the in-between virtual ports) and report my experience.

  • I wonder if you could just use Bluetooth, Network, and USB all at once and get 3 ports that way?

    Seems like it might be a fire hazard, though.
    Best to use caution.
    😉

  • edited June 2020

    I don't have Bitwig installed, but you can create a new instrument track (where you place your MPE synth) and set its input to the MIDI VST on the other track.
    This is useful to avoid virtual MIDI port routing.

    The pizmidi plugins are no longer available (hope the dev is OK), but they are hosted at
    https://code.google.com/archive/p/pizmidi/downloads

    Just avoid pizjuce; stick with the latest 64-bit pizmidi. For Windows it's
    https://storage.googleapis.com/google-code-archive-downloads/v2/code.google.com/pizmidi/pizmidi_x64_20111013.zip

  • @Korakios … the first “Plan B” setup is up & working 😉

    This could still be optimized, but I wanted to see how this MIDI routing stuff goes on the PC and what latency it incurs – mostly because looking at Ruismaker Mozaic triggered some ideas (creating some MIDI fancy on the PC with easy and not so time-consuming development tools, like Pure Data or Max/MSP, as they work data-flow oriented and have all the MIDI tools ready to use), and I was curious how an antique MIDI flagship like MIDI-OX works and “musically thinks”.

    I cannot say much about the latency from a performance perspective, as I just verified with a MIDI monitor that the data comes through & is converted correctly – Protokol by the TouchOSC developer, by the way: https://hexler.net/products/protokol
    … and I wanted to share the setup, as others might be interested.

    I chose mfxConvert (https://apps.apple.com/de/app/mfxconvert/id1451192046) for the channel-pressure-to-MIDI-CC conversion as it’s AUv3 and I want to integrate as much as possible via or directly inside AUM – the UI of mfxConvert is unbelievable, but it does the job.

    Just looking at its hex-code operations, I thought mfxConvert might do the conversions in a really efficient manner: directly operating on the MIDI byte stream and just flipping the bytes – not the event-handler way Lemur or Ruismaker Mozaic works, which might be efficient as well, but even their manuals don’t reveal how their in-app scripts run – compiled (fast) or interpreted at runtime (slow).

    The second “host kind of thing” in the iPad setup is StudioMux (as I want to use its “iPad Lightning cable to PC USB port” communication pipeline) – inside the StudioMux iPad app I create 3 virtual MIDI ports (studioTsMainA, studioTsMainB, studioTsRibbon), which are reflected on the PC side automatically, given that the StudioMux server is installed.

    In AUM (the primary host on the iPad) I run Touchscaper in an audio lane and set up 3 different MIDI lanes, one for each of the 3 Touchscaper MPE MIDI out instrument ports, and in AUM’s MIDI routing matrix I route them to the StudioMux virtual MIDI ports and hence over to the PC, so it ends up as
    • Touchscaper A MIDI Out -> mfxConvert -> studioTsMainA
    • Touchscaper B MIDI Out -> mfxConvert -> studioTsMainB
    • Touchscaper Ribbon MIDI Out -> mfxConvert -> studioTsRibbon

    On the PC I run MIDI-OX, which ran without problems for the whole time I set this up and tested it. I created 3 additional virtual MIDI ports with loopMIDI (pcTsMainA, pcTsMainB, pcTsRibbon) that are fed into the DAW and that were monitored with Protokol to verify everything is correct. So the backwards conversion “MIDI CC -> channel pressure” works this way:
    • studioTsMainA -> MIDI-OX -> pcTsMainA
    • studioTsMainB -> MIDI-OX -> pcTsMainB
    • studioTsRibbon -> MIDI-OX -> pcTsRibbon

    This words-only description is not exactly a “lighten the dark” kind of thing, but once I’ve got the final setup going, I will at least take some screenshots to make it more obvious – for the time being I hope it helps & inspires nevertheless.

    (@Michael ;-) In the end I believe this “entire MPE thing” needs a robust solution, and the more I fiddle around, the clearer it gets that this is a job & future feature for Audiobus 3 or even AUM … in the end it’s about extending the audio & MIDI lanes of the iPad over to the PC and Mac in an efficient & elaborated way, which means 8 audio channels & “a lot of” MIDI ports 😉.

    In my opinion these features are natural to Audiobus 3 and AUM (and apeMatrix as well), and “huge as they are” they should have the developer resources to get that server-side software implemented and maintained (it’s a pity that StudioMux seems to be out of production, but it is the way it is – and their VSTs for receiving audio lanes are not that stable either; sometimes they lose their connection and one has to open their UI to get them up & running again … but the feature is awesome!).

  • @CracklePot
    I think it would work – at least on a Mac with enough distance to the fire 😉 … but sadly my Mac mini is way too low on resources to handle my composing & music-making software (VSTs, analogue emulations and sample-based instruments have become joyfully powerful, but they need CPU & RAM to get rocking 😉).

    So my “music machine” is a Windows 10 one … and Bluetooth on Windows is a story of its own … in other words: if you have 1 manufacturer like Apple that “has control” and that carefully (at least with thoughts towards the future) picks hardware components and supports not only the driver at operating-system level but also the firmware side of the radio chips, you have a chance of getting something you can work with.

    On planet Windows it’s like rolling the dice to get the right Bluetooth adapter, a mission of hope that bug & security issues in the radio-chip firmware get fixed, and several times a year a surprise event when a Windows update rolls out over your machine: will it still work or not … this Bluetooth thing is too cheap, so on planet Windows no one really seems to care about it … sad but true 😉.

    +++

    I think the solution must be seen & solved from a more abstract & general perspective (beyond the MPE MIDI thing I started this discussion with), keeping the users’ workflow in sight & center.

    As mentioned in my last post yesterday, this full-fledged MIDI routing over USB (iPad Lightning cable to PC / Mac USB port) should be a feature inside an “audio & MIDI host” like Audiobus or AUM – it’s a natural extension of their core app purpose and workflow.

    I love to compose and create music on the iPad alone … but I am moving more and more towards using the iPad as an incredibly powerful & soundful extension of my PC DAW setup – and using it in both directions.

    As Touchscaper shows, iPad apps could be the master as well – delivering features and compose-flows that are just not possible (or don’t exist) on the PC / Mac.

    And looking at & using Audiobus 3 and AUM is the perfect visualization & workflow for doing so (in comparison to any DAW on the iPad) … you can create several audio & MIDI lanes and plug in whatever sound generator or effect you like – even creative MIDI sequencers like Fugue Machine, Gestrument Pro, Poly 2, Aphelian, etc.

    The only thing missing is a rock-solid & powerful audio & MIDI bridge over “Lightning cable to USB port” … bidirectional, at least for MIDI.

    Well, StudioMux claims to do that, and maybe it did in the past, but now it seems to be unsupported – MIDI messages go missing, and the VSTs in the DAW on the PC sometimes drop their audio connection, so that you have to open the device to get it connected to the iPad again. I remember even having issues with the VSTs in Ableton Live, while in Bitwig I did not.
    I do not want to blame the developers for that – StudioMux is not that expensive and it works in some use cases, so it’s okay … BUT THE FEATURES ARE NEEDED 😉.

    You can look at and investigate this from another perspective as well to understand that this “iPad Lightning to PC/Mac USB” routing should & could work – an iPad & PC/Mac setup with iConnectivity hardware:
    • mio2 as a MIDI router: https://www.iconnectivity.com/products/midi/mio2
    • AUDIO2+ as an audio router: https://www.iconnectivity.com/products/audio/iconnectaudio2plus

    You could use an Apple Camera Connection Kit together with a powered USB hub to connect both hardware boxes to the iPad via USB … and also both of them via USB to the PC or Mac … and yes, only USB, no audio or MIDI cables needed.

    And if you look into the feature set of the AUDIO2+, you discover that you can route 8 audio channels between the 2 USB-connected devices – you don’t see it on their website; I saw it in a hands-on video on YouTube which showed the routing matrix (at least if they still support that feature 😉) ... they call this AudioThru.

    USB has a lot of bandwidth – remember that many audio interfaces with several line-ins and sampling frequencies of 96 kHz and beyond run over 1 USB 2.0 connection … and the new iPad Pro has USB-C / Thunderbolt 3, which should make even bidirectional multi-channel audio possible from a bandwidth perspective.
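
    A rough back-of-the-envelope check (my own numbers, just to show the headroom):

    8 channels × 96,000 samples/s × 24 bit ≈ 18.4 Mbit/s
    USB 2.0 high speed: 480 Mbit/s nominal (roughly 280 Mbit/s usable payload in practice)

    So 8 channels of 24-bit / 96 kHz audio need well under a tenth of the practical bandwidth, and the MIDI traffic of a few ports is negligible next to that.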

    But I am not an iOS app developer, nor do I know how much access an app developer – or the developer of the “server” software on the PC/Mac side (the hook beyond the USB connection / port, which often runs as a service or daemon and hence in the tray bars of Windows and macOS) – could get to the data streams & bandwidth at the “USB port / driver end”, which I think will be at the mercy of the Apple USB driver 😉.

    … but StudioMux has proven that it is possible – maybe it’s just time to make a fresh start and build on technologies that are available now which were not when StudioMux started.

    I don’t know how AUv3 is conceptualized on the iPad – maybe this could be done in a modular approach (a single AUv3 plugin can already have multiple outputs, as EG Pulse has implemented e.g.), meaning that an independent app developer could realize an “iPad AUv3 output hook -> Lightning to PC / Mac USB -> MIDI ports & VST / AU plugins on the PC / Mac” bridge that can be plugged into Audiobus 3 and AUM (apeMatrix as well?).

    Hope this inspires … and lets something creative come to life & into reality 😉

  • @Korakios

    I remembered another MIDI conversion/mapping VST plugin, as I own their “AUM-alike product” Unify, which is MPE-compatible by the way … it’s ModMate, which you can download for free here https://www.pluginguru.com/products/modmate/ or here https://www.getdunne.net/download/modmate/.

    The latter seems more useful, as it’s the developer’s site and you get the release history – and this thing is open source on GitHub: https://github.com/pluginguru/ModMate.

    From the UI & feature perspective it fits perfectly – it maps CCs crosswise & in a “fan out” manner for 1 to 4 CCs, and it supports channel pressure as well.

    Problem though: the MIDI insert plugin version (VST2 64-bit) triggers a virus alert, and the VST3 64-bit one is buggy. Regarding the virus alert – do you think this maybe stems from the “JUCE envelope”, as you said I should avoid that for the pizmidi tools – or is that pizjuce another JUCE?

    The VST instrument plugin version (VST2 64-bit), though, seems to work, BUT it does not help within Bitwig, as the MIDI CCs recorded via track routing and successfully transformed to channel pressure are not picked up in the “note event expression” lanes inside Bitwig.

    That is not a bug in the ModMate plugin, but a MIDI-flow restriction of Bitwig itself … the MPE-relevant data has to come in through a MIDI controller (a configured MIDI device), and its code has to switch on this special MPE feature, so that an internal parser pushes the MPE-relevant data onto the “note event expression” lanes.
    Nevertheless, ModMate might work in other MPE-ready DAWs – slick and easy to use.

    +++

    Regarding Bitwig, I will now implement a MIDI controller template for the Touchscaper MIDI out feature … whereby I won’t even have to do the MIDI CC to channel-pressure conversion, as I will use another Bitwig MPE expression named “timbre”, which is by default mapped to MIDI CC 74 (instead of “pressure”, which is mapped to channel pressure).

    Maybe I could use the “Roli Seaboard” MIDI controller template for that (as it uses CC 74 as well), but I want to get an insight into how easily (or not) these MIDI controller templates for Bitwig can be developed, as this doorway (2-way, by the way) might be very helpful for other creative setups.

    I’ll come back when I’ve got it rocking 😉 … and will upload it so anyone who is interested can use it – and then it’s also time for some screenshots to make this entire work- & data-flow clearer and reproducible.

  • I meant avoid the JUCE version, as it seems to require some additional stuff. The link I posted is for the pizmidi utilities:
    https://storage.googleapis.com/google-code-archive-downloads/v2/code.google.com/pizmidi/pizmidi_x64_20111013.zip

    As for Bitwig, I can’t help you since I don’t have it installed. But did you do the following?

    • Create an instrument track. Set MIDI input from “pcTsMainA”, add the ModMate MIDI FX.
    • Create a second instrument track. Add your MPE instrument. Set its input from “ModMate” on the previous track.

  • @Korakios

    I did exactly what you describe – and Bitwig has several ways to route MIDI, by the way, which is very neat.
    But the challenge is to get the transformation (CC 20 -> channel pressure) done in a way that lets me use the “special MPE editing” features when I record the resulting MIDI inside Bitwig.

    Hope these screenshots make it clearer what I am talking about 😉 … the first one is based on a single-note touch & vertical movement in Touchscaper, recorded with channel pressure converted to CC 20 by mfxConvert.

    The following happens if I convert CC 20 back to channel pressure in Bitwig via ModMate, which works on the track “MIDI Player” while the track “MIDI Intermediate” receives and records its output. The conversion is done correctly, but it is recorded in the normal MIDI CC lane – no MPE editing features there.

    What I wanted is the following, which I edited manually and which shows the “note event expression” editing feature located underneath the piano roll – I think Bitwig calls this the “note inspector”.

    And this seems to be achievable only when the MPE-specific data flows into Bitwig from the outside world.

    But that’s incredibly easy to achieve with a custom MIDI controller (existing templates for MPE hardware should work as well, as long as they use the same channel messages) – it’s literally 1 line of code:

    noteInput.setUseExpressiveMidi(true, 0, 48);
    … true = routes the MPE-specific data onto the “note event expression” lanes
    … 0 = MIDI channel 1 will be the master channel, hence notes and their channel messages are only expected on MIDI channels 2 to 16
    … 48 = pitch-bend range of 48 semitones

    and the entire script has in essence 6 function calls – that’s it … no conversion is done, though, which means I changed the conversion on the iPad from CC 20 to CC 74, so the data is recognized as the “timbre” expression.
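
    For orientation, the essence of such a script looks roughly like this – a sketch only, based on what I described above; the GUID is a placeholder you must replace with your own unique one (the script I’ll attach is the reference):

    loadAPI(10);

    // Placeholder identity – generate your own GUID; names can be whatever you like.
    host.defineController("iOS", "Touchscaper MPE", "1.0",
        "00000000-0000-0000-0000-000000000000", "sounda");
    host.defineMidiPorts(1, 0); // 1 MIDI in, no MIDI out needed

    function init() {
        var noteInput = host.getMidiInPort(0).createNoteInput("Touchscaper MPE");
        // true = route MPE data onto the note event expression lanes,
        // 0 = channel 1 is the MPE master channel, 48 = pitch-bend range in semitones
        noteInput.setUseExpressiveMidi(true, 0, 48);
    }

    function exit() {}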

    The result is the following, which shows the power of “note event expression” editing – in Touchscaper I played a 3-note chord and moved my 3 fingers vertically, but each one differently (one might call it a “left curve gesture”).

    If you select one of the chord notes (in the screenshot it’s G4), only its timbre (CC 74) values get selected (the dark gray line), and it’s very easy to “after-edit” your touch performance – which is not only of theoretical interest, because if you change the synthesizer patch after the performance, for whatever creative reason & musical joy, you might have to adjust the “MPE performance” data manually.

    I also think this screenshot makes it clearer how great the MPE feature is, if synthesizers support it … you get a very easy to use (given an appropriate MPE editor at your disposal 😉) and very pleasant way of doing sound design, especially for long-playing chords – as you can introduce evolving changes of timbre in a visually recognizable way … letting one note / frequency of a chord dive down and rise up again, while the other notes undergo other timbral movement … this is a dream come true for all that chill-out and ambient-like music 😉.

    I will be back with another post where I upload the Bitwig MIDI controller script and explain the overall setup from the iPad over the PC into the DAW with more screenshots … I think this should be adaptable to Cubase and other MPE-compatible DAWs to some degree as well (but that must be done by others, because I only have Ableton & Bitwig at my disposal).

  • Weird, will download Bitwig again...
    So why not do the transformation in MIDI-OX, since you already use it for mapping the MIDI ports?
    In Options/Data Mapping, make the routing and save it for future sessions.

  • @Korakios
    By using CC 74, which is recognized as the timbre MPE expression, instead of CC 20, I just don’t need MIDI-OX any more – and I don’t need another set of virtual MIDI ports (created by loopMIDI so far) either, but use the StudioMux ones that are created in the iPad app and reflected by its service / server software, whose UI is accessible via the tray bar.

    I just need a MIDI controller script inside Bitwig that supports the proper routing of the MPE data – which should also work with the Roli templates, but more on that in the next post.

    I developed a custom MIDI controller script because the knowledge of what is possible & how to do it might lead to further creative options … if you watch Moss’s video about OSC and open-stage-control you might understand why: youtu.be/u-fgArpsHQQ.

    This open-stage-control seems to be very powerful (Moss states its possibilities go beyond what TouchOSC or Lemur can do for Bitwig), you can do JavaScript & CSS coding in there (at the user level), it’s open source, and it’s on its way to version 1.0: https://openstagecontrol.ammd.net/.

    To use open-stage-control it might be sufficient to use Moss’s controller script / extension … but I have something more in mind … hard to tell by now – it has to settle clearly in my mind 😉.

  • @Korakios @CracklePot @wim

    This one is about the entire setup, from Touchscaper (as one example of an app with an MPE MIDI out feature – it’s also awesome on its own for song creation & sketching … but that’s for another post 😉) inside AUM, over StudioMux, into the PC and Bitwig as the DAW.

    1. StudioMux – PC side & iPad

    It makes sense to start with this one, and on the PC side: right-click the StudioMux icon in the tray bar and click quit. The reason: it often does not pick up changes to the virtual MIDI port configuration (especially ports that no longer exist on the iPad side).

    On the iPad, open the StudioMux app and create as many virtual MIDI ports as you need to transport the “temporarily converted MIDI data” – that’s 3 for Touchscaper (main instruments A + B and the ribbon instrument). You don’t need new ports for the MIDI ports whose data is plain (non-MPE) MIDI, like Touchscaper’s sequencer synth, percussion and drum tracks, because StudioMux can route those through (you don’t see them in the following screenshot, because neither Touchscaper nor any other app is started yet).

    2. AUM

    Create an audio track, host Touchscaper in IAA mode and start it by tapping its icon.

    3. Touchscaper

    Compose yourself a piece of music and enjoy yourself … and come back after some hours, relaxed and full of joy 😉 … or just grab one of the example scenes to get the setup up & running.

    Activate MIDI out in the settings on the advanced tab, and configure MIDI out for each of the instruments (touch & sequencer instruments) in the MIDI connections dialog.

    I would also recommend activating Ableton Link in the advanced settings, because this makes recording the session inside the DAW easier.

    I route MIDI out to AUM, as I will do the conversion of channel pressure (the MIDI message sent for the finger movements after touch-down) to MIDI CC 74 in AUM via AUv3 plugins. You also might choose to switch the loudspeaker icons on again, so you hear Touchscaper’s sounds as well as getting the MPE MIDI data – which makes it easier to hear & see in the DAW that everything works as expected.

    4. AUM

    Create 3 MIDI lanes, and in each of them place a plugin for MIDI message conversion – I use mfxConvert, but others should work as well.

    Set up the MIDI conversion: from channel pressure (aftertouch) to MIDI CC 74 (this is DAW-specific – I use CC 74 in order to route that data onto the Bitwig-specific “note event expression” called timbre).

    Also notice that I have 16 conversions inside mfxConvert – which is actually not needed: with MPE you have 1 master channel and 15 note channels, hence only 15 conversions are needed, and in the case of Touchscaper only 8, as it is 8-note polyphonic and uses channels 2 - 9 for that.

    But I created an all-channel preset for mfxConvert, because the UI is so puhhhhhhh 😉 – and it’s easier to strip things down than to create new conversions.
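
    In status-byte terms (0x4A hex = CC 74), the 8 conversions that actually matter for Touchscaper boil down to:

    channel 2: D1 vv -> B1 4A vv
    channel 3: D2 vv -> B2 4A vv
    …
    channel 9: D8 vv -> B8 4A vv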

    Set up the MIDI routing from Touchscaper to the mfxConvert plugins on the main screen … so we get Touchscaper MIDI out -> AUM | mfxConvert MIDI in.

    Next, switch to the routing screen in AUM and set up the MIDI routing towards StudioMux … so we get mfxConvert MIDI out -> StudioMux virtual MIDI ports in.

    5. StudioMux

    For the sake of simplicity and ease of use on the PC side, it makes sense to deactivate, inside StudioMux, the virtual MIDI out ports of Touchscaper whose data we are converting with mfxConvert – so they don’t show up as virtual MIDI ports on the PC side.

    This is done by tapping once, briefly, on the connection symbol (two circles and the dash) in the left column under the header “source” – afterwards they appear in a darker gray.

    … plug in your “iPad Lightning cable to USB port” and head over to the PC 😉

    6. StudioMux – PC side

    Start the StudioMux service (or server software, as they call it) again – if you used the default installation locations, you will find it here: C:\Program Files\Zerodebug\studiomux\studiomux.exe.

    If you have a MIDI monitor application like Protokol (https://hexler.net/products/protokol), it makes sense to start it and verify that everything is set up correctly and works as expected – especially that all virtual ports are created and that the MIDI CC conversions come through correctly.

    7. Bitwig – Installation of the custom MIDI Controller Script

    You don’t have to use mine; you could use the “Roli Seaboard Rise” MIDI controller template as well, but it forces you to configure a MIDI out port, which is not needed at all.

    In the custom one I created (incredibly easy & just 5 function calls), I placed some comments so one can understand what is going on.

    For further reference you could watch this playlist by the “Bitwig MIDI controller script / extension guru” – if you have some basic programming knowledge you will be well served: www.youtube.com/playlist?list=PLqRWeSPiYQ66KBGONBenPv1O3luQCFQR2

    For short reading, and the JavaScript way only (Java is also possible), here are 4 blog posts: https://www.keithmcmillen.com/blog/controller-scripting-in-bitwig-studio-part-1/ (..part-2, ..part-3, ..part-4).

    To use the custom MIDI controller script (Touchscaper MPE.control.js) attached to this post, it’s just a matter of finding the appropriate folder and copying the file there.

    ATTENTION: delete the “.txt” file extension that I had to append in order to be able to upload the file (so it ends with .js, which stands for JavaScript).

    By default the path is: C:\Users\YOUR_WINDOWS_USER_NAME\Documents\Bitwig Studio\Controller Scripts.

    You can also see and open the path via the Bitwig Dashboard (click the Bitwig icon in the center of its window header to open it): Settings -> Locations -> My Controller Scripts.

    8. Bitwig – its showtime 😉

    First step: create a MIDI controller inside the Bitwig Dashboard … Settings -> Controllers -> scroll down the list on the right-hand side and click “+ Add Controller”.

    In the “Hardware Vendor” combobox choose “iOS” and the script appears in the “Product” list as “Touchscaper MPE” – click “Add”.

    By the way, you can easily change the “Hardware Vendor” and “Product” to whatever you like – rename the file and the names in the host.defineController function call accordingly (read the GUID / UUID stuff in the code comments as well) … it’s all plain text, so you can use Notepad and you are good to go (no compiler or other toolchain needed – that’s all handled by Bitwig).

    Next up is to choose the right MIDI in port … which will be the one you created in StudioMux on the iPad, fed by mfxConvert in my setup … it will appear on the PC side with the postfix “@ YOUR_IPAD_NAME”.

    To fine-tune, double-click the header “Touchscaper MPE” and you can rename this instance … which makes sense in the case of Touchscaper, as you will set up 3 of these.

    Next up is to create 3 instrument tracks (MIDI tracks) and choose the appropriate MIDI controller for each.

    Also be sure to set the pitch bend correctly – I set a pitch-bend range of 48 semitones in the script (you can alter this by simply editing the text & saving; Bitwig will pick up the changes automatically), hence I enable “PB -> expression” in the track properties and configure it from -24 down to 24 up.

    This pitch-bend feature is used by Touchscaper at least for the ribbon instrument.

    … and now everything is set up and ready to jam & record … enable Ableton Link, arm each track for recording in Bitwig & start the playback, which will trigger the arrangement & sequencer in Touchscaper (the chord progression is rollin’ 😉) … head over to the iPad and give yourself a joyful jam, which is then recorded in Bitwig with “full MPE” expressive richness 😉.

    The easiest way to see if everything works as expected is to use the hold & orbit features of Touchscaper before you start playback & record, as I did for the following screenshot: 3-finger chord + hold + orbit (the 2 buttons at the bottom right), then start playback and hence recording … then you are free to watch what happens / is recorded in the DAW; in my case, I also played the ribbon a little.

    The following screenshot shows the recording and the “note event expression” timbre of main instrument A, and Bitwig’s awesome per-note editing of it – I selected the note A3 in the piano roll, and inside the “note inspector” timbre lane the MPE data of just that note gets selected (the dark gray line and orange dots), so I can easily edit it in its musical context.

    In the next screenshot you see the pitch-bend feature in action, which is used by the ribbon instrument of Touchscaper – the gray lines in the piano roll are its visualization.

    And if you activate the micro-pitch expression editing feature (by clicking the fork symbol), you can select a note and see its pitch-bend curve and edit its points inside the piano roll, in the context of the notes it flows towards.

    SOOOOO … that’s it … I hope it helps, and also inspires some iOS app developers to build a full-fledged “audio & MIDI bridge” from iOS to PC / Mac, incorporating & using the “iPad Lightning to PC/Mac USB port” connection … this wire could do so much and extend the creativity & possibilities of PC & Mac DAW’ers in an immersive and massive way.

    And as a side note – if something that worked minutes before suddenly doesn’t any more, close (quit) the StudioMux service in the tray bar and start it again; and if you put your PC into an energy-saving state for a while and wake it up again, restart Bitwig as well … I don’t know if this strange behavior is rooted in / caused by the virtual MIDI port pipeline / subsystem of Windows, but chances are 😉.

  • That was one helluva post :-) Thanks for the write-up, will try this!

  • Wow, what a nightmare. I just wanna stick the plug in the hole and jam. Why can’t I do that instead?

  • wimwim
    edited March 2021

    @Wrlds2ndBstGeoshredr said:
    Wow, what a nightmare. I just wanna stick the plug in the hole and jam. Why can’t I do that instead?

    You can with a Mac.
    It's just a lot more difficult with a PC.

  • @wim said:

    @Wrlds2ndBstGeoshredr said:
    Wow, what a nightmare. I just wanna stick the plug in the hole and jam. Why can’t I do that instead?

    You can with a Mac.
    It's just a lot more difficult with a PC.

    I find it easier to set up on PC than Mac using rtpMIDI.

  • Hi,
    Checking in on this thread to find the best solution for transmitting MIDI (including MPE) data from an iPad to a Windows 10 PC.
    Is it using StudioMux (which is undergoing some changes) with an Ethernet cable?

    It would be great if someone could summarize the best (wired) solution that worked perfectly for them.

    BTW, here is my need: I am planning to use the Velocity Keyboard and/or GeoShred app on my iPad (still on iOS 14.8) to play VST instruments on my Windows 10 laptop.
