New MIDI app MusiKraken


Comments

  • Has anyone tried it?

  • I'd be interested to hear if anyone has used this with iOS (say, with AUM). Presumably no AU?

  • It’s on sale for $6.99 instead of $9.99

  • @Poppadocrock said:
    It’s on sale for $6.99 instead of $9.99

    Yes, but no idea if it works with Ableton 11.
    I tried to contact the dev but couldn't find an email address on the website.

  • Hello all, I am the developer of MusiKraken. (Sorry for the delay, I needed to get access to this forum first :-) ). Thank you for telling me that I forgot to add an email address to my website! There is one now...

    Some info:
    Currently the MIDI events generated in the app can be sent to anything that I can access using CoreMIDI, and to network MIDI (using the CoreMIDI implementation and my own implementation of RTP-MIDI). So if another app supports CoreMIDI, you can connect to it (there is a rough sketch of the CoreMIDI side at the end of this post). I tried connecting to multiple apps that use background audio (like AudioKit Synth One), and splitting the notes with the Chord Splitter and sending them to different apps already seems to work as well.

    But I myself haven't tried that many combinations yet, simply because I so far only use the app to control external virtual instruments on my computer. So I am quite new to the area of making music directly on my iOS device, but I will definitely update my app to support it better!

    There is no direct Audiobus or AUv3 implementation yet, but I moved it to the top of my huge TODO list and will work on it right now. Is there anything else that you use on iOS instead of CoreMIDI, Audiobus and AUv3 that I should add? The fun thing about MusiKraken (at least for me, as a developer :-)) is that everything is completely modular, so as long as I do not break already existing modules, I can add whatever feature I want to the app...

    And yes, it should work with Ableton 11 (I only have Ableton 10 so far, but no problems there). It should work with anything on your Mac or Windows computer (haven't tried Linux yet) that supports MIDI, because it simply sends MIDI 1.0 events so far. I have never tried to send MIDI events from an iOS device via Lightning cable to a Windows machine (this works on a Mac), so I'm not sure if that combination works. If you use Windows, the easiest way to connect is to use Tobias Erichsen's rtpMIDI there and connect via Wi-Fi.

    The way the app works is that you use the various input modules, like the keyboard, chord pad, TrueDepth camera, Face Tracking, Hand Tracking, microphone and accelerometer, to generate MIDI events, route them through various effects (to split chords into separate notes and send each to a separate instrument, for example), and send the events to an output port.

    I am currently trying to create better descriptions of what you can do with the app (there is already quite a lot). If you have any questions or if a feature is missing, please tell me!

    Thanks!
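
    PS: in case anyone is curious about the CoreMIDI side, here is a minimal sketch (illustrative only, not the app's actual code) of publishing a virtual CoreMIDI source and sending a note-on; any CoreMIDI-aware app can then subscribe to it:

    ```swift
    import CoreMIDI

    // Publish a virtual source; the name is just an example.
    var client = MIDIClientRef()
    var source = MIDIEndpointRef()
    MIDIClientCreate("Demo Client" as CFString, nil, nil, &client)
    MIDISourceCreate(client, "MusiKraken Out" as CFString, &source)

    // Build a MIDI 1.0 note-on packet (channel 1, middle C, velocity 100).
    var packet = MIDIPacket()
    packet.timeStamp = 0
    packet.length = 3
    packet.data.0 = 0x90   // note-on, channel 1
    packet.data.1 = 60     // middle C
    packet.data.2 = 100    // velocity
    var packetList = MIDIPacketList(numPackets: 1, packet: packet)

    // Any app listening to CoreMIDI sources can receive this.
    MIDIReceived(source, &packetList)
    ```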

  • edited December 2020

    @Snarp said:
    Hello all, I am the developer of MusiKraken. [...]

    Welcome to the forum! :blush:
    Thank you for the summary! I have just one more question:
    Hand tracking works on almost all iOS devices, but hand tracking combined with the TrueDepth module requires an iPhone X or newer, right?
    And without the TrueDepth module you can only add modulation, so it's TrueDepth that makes it possible to convert movements into MIDI notes, as I understand it. Is that right? :no_mouth:

  • Hi @cazel : Currently you need a device with TrueDepth capabilities for all camera features in the app (also for Hand Tracking) because ARKit (used for Face Tracking) also needs a front-facing TrueDepth camera to work. For Hand Tracking I currently simply use the camera image from ARKit, because mixing multiple camera instances wouldn't work. And Hand Tracking only works on iOS 14+. But yes, I should create a fallback for devices that do not have a TrueDepth front camera and support iOS 14 (I don't think I even have a testing device with this combination, but I will try to simulate this somehow...).

    I should add another thing about how MusiKraken works (I should create a video about this...): In the editor of the app, there are orange ports (MIDI) and green ports (numerical values, like the depth value from the TrueDepth camera, the hand x- and y-positions of Hand Tracking, sliding on a keyboard key, the microphone amplitude, accelerometer values etc.). If you connect a green output port to an orange MIDI input port, you can select how the values should be converted to a MIDI event. This way every numerical value can be converted to MIDI notes, MIDI Control Change events, MIDI Pitch Bend or MIDI Channel Pressure values (there is a rough sketch of this conversion at the end of this post).

    This makes it totally flexible: you could, for example, create a weird MIDI controller that converts the loudness of the microphone input into MIDI notes and plays higher notes the louder you whistle, or control one instrument by pressing the keys on the musical keyboard and another instrument by sliding on the keys. Obviously not all combinations make much sense, but I have already discovered quite useful combinations that I hadn't thought of before...

    So yes, once I support Hand Tracking on devices without a TrueDepth camera, you could use your hand to generate notes as well. (Maybe I should also look into the new Body Pose tracking in iOS, then you could make music by jumping around in front of the camera...? I "just" need to find a good way to translate the body movements to MIDI events. :smiley: ).
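
    To make that green-to-orange conversion concrete, here is a rough sketch (made-up names, not MusiKraken's internals) of turning a normalized 0...1 value into the different MIDI event types:

    ```swift
    // Hypothetical converter: maps a normalized 0.0...1.0 control value
    // onto one of the MIDI event types mentioned above.
    enum MidiMapping {
        case note(low: UInt8, high: UInt8)      // map the value onto a note range
        case controlChange(controller: UInt8)
        case pitchBend
        case channelPressure
    }

    func midiBytes(for value: Double, mapping: MidiMapping, channel: UInt8 = 0) -> [UInt8] {
        let v = min(max(value, 0.0), 1.0)               // clamp to 0...1
        switch mapping {
        case .note(let low, let high):
            let note = UInt8(Double(low) + v * Double(high - low))
            return [0x90 | channel, note, 100]          // note-on with a fixed velocity
        case .controlChange(let controller):
            return [0xB0 | channel, controller, UInt8(v * 127)]
        case .pitchBend:
            let bend = UInt16(v * 16383)                // 14-bit pitch bend value
            return [0xE0 | channel, UInt8(bend & 0x7F), UInt8(bend >> 7)]
        case .channelPressure:
            return [0xD0 | channel, UInt8(v * 127)]
        }
    }

    // Example: a depth reading of 0.4 mapped to the mod wheel (CC 1).
    let modWheel = midiBytes(for: 0.4, mapping: .controlChange(controller: 1))
    ```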

  • @Snarp said:
    Currently you need a device with TrueDepth capabilities for all camera features in the app (also for Hand Tracking) [...]

    Oh, I see. So with the 8 Plus (iOS 14.2) I won't be able to use hand tracking + TrueDepth :confused:

    • I think all the MIDI routing possibilities MusiKraken offers are very clear after reading the manual. However, a detailed video would definitely be great :smile: People usually prefer videos over manuals these days :blush:
  • Today I implemented the fallback to use the "normal" camera for hand tracking if no TrueDepth camera is available, so you should be able to use hand tracking on the iPhone 8 Plus once I release it. But first I want to make hand tracking a bit more interesting when used without the TrueDepth sensor (it also gives me the position of each finger, so I will try to find nice parameters that I can compute from that... :smiley: ).

  • @Snarp welcome to the forum.
    Just so you know, my brief description of your app was meant to be all positive.
    😊

  • @CracklePot said:
    @Snarp welcome to the forum.
    Just so you know, my brief description of your app was meant to be all positive.
    😊

    Haha, thanks! Well, it IS deep, weird and experimental... (but you can also do "normal" things with it...)

  • @Snarp said:
    Today I implemented the fallback to use the "normal" camera for hand tracking if no TrueDepth camera is available, so you should be able to use hand tracking on the iPhone 8 Plus once I release it. But first I want to make hand tracking a bit more interesting when used without the TrueDepth sensor (it also gives me the position of each finger, so I will try to find nice parameters that I can compute from that... :smiley: ).

    I don't really know what I was researching :grimace: but found these:
    https://developer.apple.com/documentation/arkit/ardepthdata
    https://developer.apple.com/documentation/arkit/ardepthdata/3566296-depthmap
    As I understand it :smiley: it seems like there is also another way to create depth data.
    I wish I could help! It's just the most amazing tool for my workflow! Maybe I should buy a new iPhone asap! :#

  • @cazel said:

    I don't really know what I was researching :grimace: but found these [...]

    Also this: https://developer.apple.com/documentation/arkit/creating_a_fog_effect_using_scene_depth
    I just hope these are not related to the TrueDepth cam :#

  • @cazel said:
    Also this: https://developer.apple.com/documentation/arkit/creating_a_fog_effect_using_scene_depth
    I just hope these are not related to the TrueDepth cam :#

    :smiley: Sorry, I think none of these work on the iPhone 8 Plus. I have a similar problem in the Android version of the app (not released yet, but coming soon), where they try to estimate the depth, but it is just made for hiding things in a static scene, not for fast-moving objects (like hands). And I will not implement hand tracking on Android while the feature is still in an alpha state (and still very experimental, as is typical of Google). So you are lucky on iOS, because the hand tracking feature already works quite well (at least on my phone).

    But you should already be able to do similar things when using hand tracking instead of the TrueDepth camera (once my next release comes out). If you, for example, point the camera at yourself instead of upwards like in the video, you should also be able to generate notes by moving your hand up and down and modulate by moving it sideways. And as I wrote previously, I will try to add a few extra features, like computing the average distance between the fingertips, so that you can also generate events by moving the fingers closer together or opening the hand more... But I need to test this first; I will try to work on it tomorrow...

  • @Snarp said:

    [...] you should already be able to do similar things when using hand tracking instead of the TrueDepth camera (once my next release comes out) [...]

    Great news! Thanks for taking the time to explain everything :blush:
    I just got MusiKraken and can't wait to try it!

  • @cazel said:
    Great news! Thanks for taking the time to explain everything :blush:
    I just got MusiKraken and can't wait to try it!

    Thanks! I updated the hand tracking today so that I could control anything with each separate finger, but it was overkill :neutral: and too chaotic, so I will try to compute nice parameters instead tomorrow (hand rotation, finger distance etc.), before I release this version.
    And body tracking is also already working, but I also need to find out how to best use this before you can Dance-To-Make-Music™... (or Dance Music?).

  • edited December 2020

    The new update is now out. You should now be able to use Hand Tracking with a non-TrueDepth camera. And I added the following parameters (a rough sketch of how the first two can be computed follows at the end of this post):
    - Size: With this you can do something similar to the TrueDepth camera. It uses the distance between your wrist and the base of your middle finger to compute the size of your hand in the camera frame, which is roughly the inverse of the distance of your hand from the camera. The range is smaller than with the TrueDepth camera and it is less accurate, but as long as your wrist is visible in front of the camera, it works quite well.
    - Open: This is the "openness" of your hand. It measures the distance between each fingertip and the thumb tip, so putting those fingers together decreases this value, while spreading the fingers increases it. Really useful for modulation!
    - Angle: This is the angle between the middle finger and the x-axis of the camera, so waving towards the left gives you values down to -1.0, while waving to the right goes up to 1.0.
    - Tilt: This is probably the least accurate of the values, but the movement is easier to perform than the angle gesture: just tilt your hand sideways.

    I hope it is fast enough on older devices; so far I could only test this on an iPhone XS (my older iPhone 6 doesn't support iOS 14 anymore), but there it works surprisingly well...
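
    For the curious, here is a rough sketch (illustrative only, not the app's actual implementation) of how Size- and Open-style parameters can be computed from Apple's Vision hand-pose results on iOS 14+:

    ```swift
    import Vision
    import CoreGraphics

    // Sketch of computing "Size"- and "Open"-style parameters from a Vision
    // hand-pose observation. Joint names come from VNHumanHandPoseObservation.
    func handParameters(from observation: VNHumanHandPoseObservation) throws -> (size: CGFloat, open: CGFloat) {
        func point(_ joint: VNHumanHandPoseObservation.JointName) throws -> CGPoint {
            try observation.recognizedPoint(joint).location
        }
        func distance(_ a: CGPoint, _ b: CGPoint) -> CGFloat {
            hypot(a.x - b.x, a.y - b.y)
        }

        // "Size": wrist to the base of the middle finger, a proxy for hand distance.
        let size = distance(try point(.wrist), try point(.middleMCP))

        // "Open": average distance from the thumb tip to the other fingertips.
        let thumb = try point(.thumbTip)
        let tips: [VNHumanHandPoseObservation.JointName] = [.indexTip, .middleTip, .ringTip, .littleTip]
        let open = try tips.map { distance(thumb, try point($0)) }.reduce(0, +) / CGFloat(tips.count)

        return (size, open)
    }
    ```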

  • @Snarp The latest update seems pretty awesome. Got to ask though: what's with the huge keyboard and no setting for it? The picture shows an iPad Air 3, an iPhone SE and an Akai MPK mini. The Akai has small keys, I'll give you that, but look at the iPad: I could probably play with my foot and not miss a key (joking). Thanks again for the update and keep 'em coming! :D

  • edited December 2020

    @Pxlhg said:
    [...] what's with the huge keyboard and no setting for it? [...]

    You can change the key width in the keyboard settings, but yes, you are right, the default values for iPad are wrong! I have a quite complex formula here to scale nicely from small screen sizes to large ones, which also takes the screen DPI into consideration (because I need to support every screen size possible, as the app also runs on Android, Windows and Mac; a rough illustration of the general idea is at the end of this post). But I need to change it and test it on a few more devices.

    For the key height, there will soon (hopefully? Depends on how much time I have) be the option of having two or more of those controls simultaneously on larger screens. But I haven't defined the details of that yet, as the screen size computations get even more complex (= which screen sizes do I support, and how many of those controls can there be...).

    And playing with my foot sounds interesting, I will try this! :smiley:
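
    On the DPI point above, here is an illustration of the general idea (not my actual formula): aim for a physical key width and clamp it so a minimum number of keys always fits on screen.

    ```swift
    // Illustrative only: DPI-aware key sizing by targeting a physical width.
    // The target width and minimum key count are made-up example values.
    func keyWidthPixels(screenWidthPixels: Double, dpi: Double,
                        targetKeyWidthMM: Double = 12.0, minVisibleKeys: Int = 7) -> Double {
        let targetPixels = targetKeyWidthMM / 25.4 * dpi           // mm -> inches -> pixels
        let maxPixels = screenWidthPixels / Double(minVisibleKeys) // keep at least minVisibleKeys keys visible
        return min(targetPixels, maxPixels)
    }
    ```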

  • @Snarp just introduced AUv3 support. Cool!

  • @Snarp said:
    And playing with my foot sounds interesting, I will try this! :smiley:

    :D
    I found the setting and that got a bit better, thanks. Maybe the cogwheel for the keyboard settings should be on the keyboard page? The little tab that actually says "keyboard" :) could instead say, for example, "settings". Just a thought.

  • @Pxlhg said:

    @Snarp said:
    And playing with my foot sounds interesting, I will try this! :smiley:

    :D
    I found the setting and that got a bit better, thanks. Maybe the cogwheel for the keyboard settings should be on the keyboard page? The little tab that actually says "keyboard" :) could instead say, for example, "settings". Just a thought.

    I did "hide" the settings for the keyboard in the editor on purpose :wink: . I only put the settings for the controls that might need to be changed while playing on the main page (so far only the chord pad and the AudioUnit modules), all the other ones need to be changed in the editor. Because adding cogwheels to every control on the main screen might get pretty chaotic :smiley: .

    But yes, I need a better introduction in the app so that the users know where to find things.

    The tab says "keyboard", by the way, because that is the default name of this control; this way you can tell them apart if you have multiple active keyboards. You can change this name in the settings. And you can change the order of these tabs, plus the order of the top-bar items, by dragging them to another place, which is something most users will probably only discover by accident so far, so I definitely need to create a video about how to use this thing :blush: .

  • @Snarp said:

    I did "hide" the settings for the keyboard in the editor on purpose :wink: [...]

    I understand. I'm very new to the app and have only used it briefly to try to figure out what it can do. I don't have the camera needed for what really promoted the app, but it looked interesting. I have yet to use it in a creative process but will try that soon. All good. :)

  • @Pxlhg said:
    I understand. I'm very new to the app and have only used it briefly to try to figure out what it can do. I don't have the camera needed for what really promoted the app, but it looked interesting. I have yet to use it in a creative process but will try that soon. All good. :)

    Did you try the "Hand Tracking" control? It also works on devices without the TrueDepth camera, as long as you have iOS 14. It's a bit different to use than the TrueDepth module, though. If you want to do the same as with the TrueDepth camera (= make music by moving your hand up and down), you can either:

    1. Use the "size" port. Set it to a range of about 0.1 to 0.5 (depending on your hand size, so you might need to adjust this) and invert the value to get a result similar to the TrueDepth module (there is a small sketch of this range/invert step at the end of this post). The size value tries to measure your hand size by taking the distance between your wrist and the base of the middle finger, so it only works if both are visible to the camera.

    2. You can also point the camera at yourself and use the y-port. The x- and y-values track your hand position relative to the camera image, so moving the hand up increases this value, while moving it down decreases it. Or move it sideways to change the x value. This works best if only one hand is visible to the camera, so put the other one in your trouser pocket or behind your back if possible. I could actually track both hands at the same time, and maybe I will at some point, but so far I had the problem of detecting which one is hand no. 1 and which one is hand no. 2, which makes it more difficult to assign them to instruments. I will try to find a good solution for this.

    But you can also combine these, of course, or combine them with any of the other ports. For example, use the y-port for creating notes and the x-port for modulation...
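
    As a concrete illustration of that range/invert step (a made-up helper, not the app's API), this is all it does mathematically:

    ```swift
    // Hypothetical helper: normalize a raw port value from its configured range
    // into 0...1, optionally inverted, before converting it to a MIDI event.
    func normalized(_ raw: Double, lower: Double = 0.1, upper: Double = 0.5,
                    inverted: Bool = true) -> Double {
        let clamped = min(max(raw, lower), upper)
        let t = (clamped - lower) / (upper - lower)   // 0 at the lower bound, 1 at the upper bound
        return inverted ? 1.0 - t : t
    }

    // Example: a "size" reading of 0.45 (hand close to the camera) becomes 0.125
    // after inversion, behaving like a small depth value from the TrueDepth module.
    let value = normalized(0.45)
    ```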

  • Oh man I would love to give this a go, but I have absolutely no idea if it will work on my setup. iPhone 11/iPad Pro 10.5” latest iOS, ASUS router, Windows 10 Pro 2002, Cubase 10.5 Pro.

  • edited December 2020

    @ChrisG said:
    Oh man I would love to give this a go, but I have absolutely no idea if it will work on my setup. iPhone 11/iPad Pro 10.5” latest iOS, ASUS router, Windows 10 Pro 2002, Cubase 10.5 Pro.

    Hi @ChrisG, the iPhone 11 should have a TrueDepth camera, and even a faster chip than the iPhone XS I use in the demos above, so you should be able to use every feature the app has at the moment. The iPad Pro 10.5” probably doesn't have TrueDepth (I think they started adding it with the 11'' ones), but you should be able to use everything else there.

    If you want to send the MIDI data to Cubase via your WiFi, you can use the free rtpMIDI driver by Tobias Erichsen:
    https://www.tobias-erichsen.de/software/rtpmidi.html
    (there are other options, but this one works the best, I think).

    Both your computer and your iOS device need to be connected to the same Wi-Fi in this case. Create a CoreMIDI Network module in MusiKraken, and you should then see your phone in rtpMIDI on your computer. Once you connect to it, you can use the port in Cubase. I also use rtpMIDI when making music on my Windows computer in combination with iOS, because it is the easiest to set up. I have never tried to send the MIDI data from iOS to Windows via the Lightning cable; there seem to be some solutions, but they sounded a bit hacky (it works nicely when connecting Android to Windows by cable, though). Maybe I should try to find a good solution for that as well...
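
    On the iOS side, the network-MIDI part boils down to enabling CoreMIDI's built-in network session (a minimal sketch, not MusiKraken's code); rtpMIDI on the computer then sees the device as a session participant:

    ```swift
    import CoreMIDI

    // Enable CoreMIDI's RTP-MIDI network session on iOS so that rtpMIDI on a
    // Windows or macOS machine on the same Wi-Fi can connect to it.
    let session = MIDINetworkSession.default()
    session.isEnabled = true
    session.connectionPolicy = .anyone   // accept connections from any host on the network

    // Once connected, session.sourceEndpoint() and session.destinationEndpoint()
    // behave like any other CoreMIDI endpoints.
    ```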

  • @Snarp

    Thanks, sounds good. I've used Tobias's rtpMIDI before but can't remember what for (likely another iOS app :-) ). Will give this a go; it's always fun to try out different ways of interacting with synths/sample libraries.

  • A quick note if you have a new iPad Pro: the current version of MusiKraken that is in the store will look bad on your device. The reason is that the new version of Unity that I used returns wrong DPI values for newer iPad Pro devices (and I obviously forgot to test on newer iPad Pros after updating :-( ).

    I sent a bug report to Unity and will create a workaround for now, but the Apple App Store is closed for releases until the 27th, so it will take a few days until it works correctly again.

    For all other devices, there shouldn't be any problems...
