New MIDI app MusiKraken


Comments

  • Hi @Kouskb, thanks for using the app!

    • I implemented a MIDI-to-value converter two weeks ago! It will be included in the next release. Currently you can filter and convert Control Change events, Channel Pressure, and notes to values (you can set a note range, and the note values of Note-On events will be converted to a value between 0.0 and 1.0; a sketch of this mapping follows at the end of this post). I will also add pitch bend conversion later; I want to generally rework these settings, so it will probably be in a later release... Do you need to convert anything else?

    • Yes, running in the background is something that I have always pushed further down my roadmap, because I had more fun things to do :smiley: . But it probably isn't a very large change, just a lot of testing, so I will now really add it as soon as possible... There are also a lot of other iOS-only changes that I want to make, like adding support for audio effect Audio Units...

    • I will look into the Audiobus-to-MusiKraken switching problem. I'm not sure what the problem is here. Can you send me a screenshot of the greyed-out swipe bar? Maybe then I can reproduce it...
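
    In pseudo-Swift, the note-to-value mapping described above would look roughly like this (a simplified sketch, not the actual implementation; the range bounds are just example values):

    ```swift
    // Simplified sketch of the note-to-value conversion: Note-On values inside
    // a chosen range are normalized linearly to 0.0...1.0; notes outside the
    // range are filtered out.
    func noteToValue(note: UInt8, low: UInt8 = 48, high: UInt8 = 72) -> Double? {
        guard low < high, note >= low, note <= high else { return nil }
        return Double(note - low) / Double(high - low)
    }

    // Example: middle C (note 60) in the range 48...72 maps to 0.5.
    _ = noteToValue(note: 60)   // Optional(0.5)
    ```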

  • @Kouskb said:
    Hi @Snarp, thanks for providing us with this powerful app! With the new features like Threshold and Beat, I'd use it much more as a versatile MIDI router able to process motion-based input than for experimenting with camera-based input.

    My project is to build a minimal "busker" setup with only my (acoustic) guitar, my iPhone, an audio interface and a vibration speaker. No pedal. Then it's about getting the most out of the built-in sensors on the iPhone, and as I'm playing guitar and singing, gestures and facial expressions are not so practical, but moving the device is.

    MusiKraken is, to my knowledge, the only scripting-free iOS app that integrates both sensors and routing with sufficient versatility. Reproducing this functionality by combining, say, Holon.ist or TC-Data on the sensor side with Audiobus, MidiFire or MidiFlow on the routing side would require a fair amount of scripting in StreamByter or Mozaic and tens of times the effort. So I'm all for it... if it were not for a couple of critical shortcomings:

    • Please make it run in the background! The camera-based stuff will likely never work there, but the motion sensors and MIDI routing should (TC-Data runs in the background, for instance, and so do MIDI routing apps).
    • It's too cumbersome to switch between Audiobus and MusiKraken. I don't know if it's specific to my iPhone, but the shortcut to the iOS app-switching menu from MusiKraken is buggy (the bar I should swipe up is deactivated/greyed out; I have to tap multiple times to get it black/activated again before I can switch to other apps. That's a show stopper...). Ideally, one could do this hands-free via MIDI through MIDI Learn in Audiobus if MusiKraken were supported there as a MIDI input/fx/output IAA app (that would be awesome!!!).
    • A MIDI-to-value converter would be very handy in a number of cases. Add to this the (already functional) possibility of having Mozaic in AU blocks to fill in any missing features, and I don't see what MusiKraken cannot do...

    Your busking rig idea looks similar to what I built:

    All sound is coming from the guitar itself; there is no external speaker or sound coming from the iPhone. The video doesn't capture how good it actually sounds in the room. I have an iRig, a mini amp, and actuators inside the sound hole. A lithium-ion battery provides all the juice.

    I am interested in how you are thinking of using MusiKraken for your use case. I currently use this little Genki ring to change settings on the fly but am always looking for improvements!

  • @Snarp Thanks for the answer! What a committed developer :-)

    • Great news about the MIDI-to-value feature!!! Just note and CC to value should do the job. That opens a ton of possibilities. Basically, we're coming close to the level of flexibility that scripting in MidiFire/StreamByter, Mozaic or Holon.ist/JavaScript provides, but with a much more user-friendly flowchart approach.

    • Thanks for making running in the background a priority; that will be a game changer for those not owning, or not wanting to use, multiple devices. About iOS-only changes, adding IAA support to enable integration in Audiobus would be nice.

    • Here are two screenshots, one with the normal (black) bar and one with the bar greyed out. I'm using an iPhone 12 without a home button, so app switching has to be done by swiping up this bar. It seems it is black for a couple of seconds, then becomes deactivated and greyed out. Sometimes a gentle swipe suffices to activate it again (so I'd use two swipes instead of one to get to the app switcher), but sometimes MusiKraken is pretty stubborn and I have to tease it a while before the bar gets black again.


  • @lukesleepwalker Sweet rig! And with a carbon fiber guitar on top of that! I'll PM you to share thoughts and avoid hijacking this thread.

    Now, about my use of MusiKraken: I'd just twist/tilt/shake the guitar and the phone attached to it to trigger CC commands. As I want to control MIDI accompaniment, looping and effects, I need to assign CCs to a variety of motion patterns (shakes, zero-crossings and double taps triggering different CCs depending on yaw angle, for instance). You can do this in MusiKraken by using combinations of blocks such as value to MIDI, MIDI to value (soon :-)), Speed, Threshold and Beat (see the sketch after this post).

    Yes, I could do this through coding in Mozaic, or in MidiFire/StreamByter, or directly in Holon.ist (the latter would be the most direct competitor to MusiKraken, I think). I even considered getting this done on an Arduino Nano BLE Sense (if you've ever heard of this: a DIY Genki ring, if you like :-)) that would send all the customized MIDI data from motion sensing directly to the iPhone, but I realized I'm coding enough at work and prefer to relax with a graphical approach in my scarce free time :-). I'll keep this for last-resort fixes by running Mozaic in AU blocks in MusiKraken.

    If you'd like to use the Genki ring, you'd use it to trigger notes/CCs in a carefully chosen way and send this as MIDI input to MusiKraken, which would convert the notes/CCs to values, thus opening up combinations with other sensors for conditionally triggering whatever you like and getting as many CC triggers as you need with only 3D motion sensing.
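
    A minimal sketch of what a Threshold-style block could do internally (hypothetical code, assuming a simple rising-edge trigger; MusiKraken's actual blocks are configured graphically, so this is only an illustration):

    ```swift
    // Hypothetical Threshold-style block: a continuous sensor value becomes a
    // one-shot CC trigger that re-arms once the value falls below the threshold.
    struct ThresholdTrigger {
        let threshold: Float
        var armed = true

        // Returns a (cc, value) pair on an upward crossing, otherwise nil.
        mutating func process(value: Float, cc: UInt8) -> (cc: UInt8, value: UInt8)? {
            if value >= threshold {
                guard armed else { return nil }
                armed = false
                return (cc, 127)
            }
            armed = true   // re-arm below the threshold
            return nil
        }
    }

    // Example: a shake spike above 0.8 fires CC 20 once per crossing.
    var trigger = ThresholdTrigger(threshold: 0.8)
    _ = trigger.process(value: 0.9, cc: 20)   // -> (cc: 20, value: 127)
    ```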

  • @Kouskb Right on, thanks for the explanation!

  • @Kouskb Ah, you meant the "home indicator". That's actually not a bug, that's a feature :wink: . MusiKraken uses deferred system gestures on the lower edge (a setting in iOS called preferredScreenEdgesDeferringSystemGestures; see the sketch at the end of this post). The reason for this is that without it, it is almost impossible to play a song on the keyboard on an iPhone, because every time you slide (only a little; an accidental slide is enough) in the area where the home indicator is, the app will go to the background and the app-switching screen will appear. GarageBand, for example, also defers the system gestures because of this. So instead of once, you need to swipe upwards twice in a row: once to get it from grey to black, and then again to get to the app switcher...

    IAA was already deprecated by Apple when I started implementing MusiKraken, so I always assumed it would vanish soon and never looked into it. Maybe I should still do that... I also want to find time to at least try to run MusiKraken as an Audio Unit. I assume it will be problematic and won't be easy to set up, which is why I haven't spent too much time on this yet, but it would be useful if it worked.

    The reason why it might not be possible is that I use the game engine Unity for the GUI and a few other things in MusiKraken (combined with many native plugins so that I can replace anything that doesn't work that nicely out of the box). At least according to the documentation, Unity does not work when it is not running full screen, but as I haven't tried it yet, I can't say if there is a workaround for that... But I have a full and easy-to-use 3D and physics engine in there, so expect to see some fun new features in the future :smiley: .
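
    For the curious, the deferral setting mentioned above looks roughly like this in a plain UIKit view controller (a minimal sketch under that assumption; MusiKraken itself goes through Unity and native plugins, so the real implementation differs):

    ```swift
    import UIKit

    class KeyboardViewController: UIViewController {
        // Defer the system gesture on the bottom edge so a small accidental
        // slide over the home indicator doesn't background the app mid-song.
        override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
            return .bottom
        }
        // With this override the home indicator dims (the "greyed out" bar):
        // the first swipe re-activates it, the second opens the app switcher.
    }
    ```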

  • @Snarp Got it!

    Sorry for the sloppy terminology :-) As I said, swiping twice would be OK, but sometimes I really have to insist, I don't know why. That takes a couple of seconds, which is quite critical when you're playing. It would be nice if we had the option to turn deferring gestures off. As for me, playing on the GUI keyboard is only for debugging, and the layout can be changed to free the lower edge anyway, so I would definitely prefer easy switching.

    IAA is deprecated but too widely used to vanish anytime soon, and it is a low-hanging fruit in terms of implementation, I believe. AU, on the other hand, would be more powerful and future-proof, but in my understanding that would imply an intrusive adaptation of the app architecture.

    In any case, both require running in the background for any practical use... I see the dependency limitation there.

  • @Kouskb I will try to add a setting to disable deferring the system gestures. It should be easy to implement, but I need to test it first.

    The basics for running the app in the background are already implemented. The motion sensors continue to work nicely while the app is in the background, and it also continues to communicate with external Bluetooth hardware in this mode.

    The only problem is that the app currently takes up too much memory in the background and gets killed by the system soon after I start another app. :neutral: So I need to find out how to release as much as possible while the app is in the background, which means I have to change a few things. But I will find a way...
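
    The kind of change involved might look like this minimal UIKit sketch (hypothetical, not MusiKraken's actual code): drop heavy GUI-only resources when the app is backgrounded, so the system's memory watchdog has less reason to kill it while MIDI routing and motion sensing keep running.

    ```swift
    import UIKit

    // Hypothetical sketch: release non-essential assets on backgrounding.
    final class BackgroundMemoryTrimmer {
        private var guiCaches: [String: Data] = [:]   // stand-in for heavy assets
        private var observer: NSObjectProtocol?

        init() {
            observer = NotificationCenter.default.addObserver(
                forName: UIApplication.didEnterBackgroundNotification,
                object: nil,
                queue: .main
            ) { [weak self] _ in
                // Drop textures, previews, etc.; reload them on foregrounding.
                self?.guiCaches.removeAll()
            }
        }
    }
    ```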

  • I added Apple Watch support to MusiKraken. You can use the motion sensors, heart rate, an XY-pad and the digital crown of the watch to mix with the other features of MusiKraken.

    MusiKraken now also supports background audio. That means you can route MIDI through MusiKraken while another app is in the foreground, or make music while the display is off. You cannot use any of the camera features and (obviously) no touch while the app is in the background, but you can use the motion sensors, microphone, Apple Watch and external MIDI devices that are connected to MusiKraken.

    And there is now also a MIDI-to-value converter that can be used to convert MIDI events from external devices into the internal MusiKraken system.

    And you can now create your own scales by selecting the notes.

    I entered MusiKraken into the MIDI Innovation Awards! So please vote for it!
    https://www.midi.org/component/zoo/category/non-commercial-software?Itemid=1423

    @Kouskb The setting to disable deferring the system gestures is already implemented, but not yet in the released version; it will come in the next release...

  • Does anyone here use MusiKraken without the TrueDepth camera on their iOS device?

    I’m trying to decide if my iPad 6 would be good enough to use most of the Input Modules.. 🤷‍♂️
    Seems like iPad Pros are the only iPads with TrueDepth cameras..

    Important: Please note that some of the modules only work on iOS devices with specific hardware: ARKit Face Tracking and the TrueDepth module need a TrueDepth front camera, and Hand and Body Joint Tracking need at least iOS 14 and might be too slow on older devices.

    Input Modules:
    -Keyboard: MPE support, scale highlighting, control multiple values by sliding and touch pressure or finger radius.
    -Chords Pad: Play chords and their inversions of a selected scale.
    -Touchpad: Slide multiple fingers on a touchpad to control up to 5 values simultaneously.
    -ARKit Face Tracking: Uses the TrueDepth camera to generate notes or control parameters using your mouth, eyes, tongue or by moving your head.
    -Mouth Tracking: Uses the normal camera to control parameters using your mouth.
    -TrueDepth: Use the depth signal of your camera to generate MIDI data by moving your hand (or anything else) in front of the device.
    -Hand Tracking: Uses the camera to convert your hand position and simple hand gestures to MIDI. You can use one or two hands and it can also use the TrueDepth sensor if available.
    -Body Tracking: Uses the camera to track hand, feet and head positions of up to two persons.
    -Motion Sensors: Uses the combination of Accelerometer, Gyroscope and Magnetometer to detect the current rotation of the device.
    -Accelerometer: Measures the acceleration of the device.
    -Apple Watch: Use the motion sensors, heart rate, touch and digital crown.
    -Microphone: Detects pitch and volume using the microphone.
    -External Input Device: Receives MIDI events from connected input devices.

  • Hi @royor, it depends on what you want to do with the app. Without the TrueDepth sensor, you still get: the MPE keyboard; the chords pad; the touchpad; hand tracking (with only simplified distance computations based on the size of your hand, but everything else should work the same, so you still have the x- and y-axes and hand gestures); mouth tracking (face rotation without TrueDepth can only be detected in 45-degree steps on iOS, which is quite useless, so I only added mouth tracking here); body joint tracking; the motion sensors (probably more useful on an iPhone, but it depends what you want to do with them); the microphone (there is no Apple Watch support for iPads); and all the effects and output modules...

    So you can use almost everything, except the distance computations and the face rotation (and no eye and tongue tracking).

    And there will be more soon :D . And the app is on sale at the moment (40% off).

  • @Snarp just purchased (thx for the discount) without much of a use case in mind but I totally dig what you are doing with this app and want to make a small investment in your progress.

  • @Snarp Thanks so much for your detailed answer.. Bought!
    This looks like it’s gonna be a blast.. off to watch a few videos + then play time.. 😁

  • @lukesleepwalker : Thanks!! I hope you will find a nice use case for it!
    @royor : Thanks!! If you have any questions on how to create a specific setup, just ask here or send me a mail.

    And everyone, voting is still ongoing, please vote for MusiKraken at the MIDI Innovation Awards: https://www.midi.org/component/zoo/category/non-commercial-software?Itemid=1423

  • MusiKraken is a MIDI Innovation Awards finalist!! Thanks everyone for voting!

    The app is one of the three finalists in the Non-Commercial Software (= low budget) section. In the commercial section, Animoog Z and GeoShred are in the finals, so the iOS music community is very strong!

    Thank you!!! (I extended the 40% off sale of MusiKraken for another week because of this...)

  • Congrats!! Just bought MusiKraken; it would be great to have more examples and setups included!

  • @Snarp Congratulations.


  • Thanks again, everyone! MusiKraken won at the MIDI Innovation Awards!!!

  • @Snarp Congratulations!!

  • @Snarp Congratulations.

  • @Snarp I was inspired to buy this app. It is amazing; I don't know how I missed it. I was a little on the fence because I have purchased other MIDI apps, and the setup is so tedious. I just want to play, not do math. Here the setup is so fast and intuitive.
    Very cool.
    I did have one quick question, though.

    In the arpeggiator, what do the adjustable vertical bars do? Normally they are the gate or velocity in other arpeggiators. But what do these do? Can I assign them to something? If so, how?
    Thanks

  • @eross Thanks!
    Yes, they are the velocity of each note played in the arpeggiator. You can tap or slide on the bars to change the values, or you can press the bottom numbers to deactivate one of them completely so that a beat is skipped. Currently you cannot make a beat longer, but there will be a more complex "pattern generator" that will be able to do that in the (hopefully near) future.
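
    As a rough model (a hypothetical sketch, not MusiKraken's actual code), each bar is a per-step velocity and each bottom number is an on/off flag:

    ```swift
    // Hypothetical model of the arpeggiator bars: one velocity per step
    // (the bar height) plus an active flag (the bottom number).
    struct ArpStep {
        var velocity: UInt8 = 96
        var active: Bool = true
    }

    // Velocity to play for a given beat, or nil to skip that beat entirely.
    func velocity(at beat: Int, in steps: [ArpStep]) -> UInt8? {
        guard !steps.isEmpty else { return nil }
        let step = steps[beat % steps.count]
        return step.active ? step.velocity : nil
    }
    ```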

  • @Snarp Congratulations on MusiKraken winning at the MIDI Innovation Awards. 👏

  • @Snarp Congrats on the award, well deserved!

  • Congrats @Snarp! Well deserved!

    I'm sure you have lots of ideas for the app, but here's one for you: it would be cool if there were a way to orchestrate a progression of actions through a single MIDI gesture. So, for instance, combined with other inputs/effects: if I twist the device a certain way, it sends one note/CC; if I do it again, it sends the next note/CC; do it again, and it sends the next one, etc. (A sketch of this idea follows below.)

    Another idea: a very simple note filter. Listen for these 4-5 notes; when you hear one, do something...
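
    A minimal sketch of that first idea (hypothetical; no such block exists in MusiKraken at the time of writing): each time the gesture fires, the next raw MIDI message in a list is sent, wrapping around at the end.

    ```swift
    // Hypothetical "action progression": every trigger emits the next MIDI
    // message in the list, wrapping around at the end.
    struct ActionSequencer {
        let actions: [(status: UInt8, data1: UInt8, data2: UInt8)]
        var index = 0

        mutating func trigger() -> (status: UInt8, data1: UInt8, data2: UInt8) {
            defer { index = (index + 1) % actions.count }
            return actions[index]
        }
    }

    // Example: three successive twists send CC 20, 21, 22 (0xB0 = Control
    // Change on MIDI channel 1).
    var seq = ActionSequencer(actions: [(0xB0, 20, 127), (0xB0, 21, 127), (0xB0, 22, 127)])
    _ = seq.trigger()   // -> (status: 0xB0, data1: 20, data2: 127)
    ```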

  • @lukesleepwalker Great ideas, thanks! I am currently working on something in this direction, but I hadn't thought of creating a sequencer for actions yet. Good idea!

    What I already did (not released yet) is create a "Trigger" that can trigger actions. I need these for configurable buttons, for example for the game controller support (already working on iOS, but I still need to implement the Android version). Currently a trigger can only play notes or chords, or send other MIDI commands, but triggers could do a lot more, like... orchestrate a progression of actions?

    Thanks!

  • @Snarp Exactly! That would be really useful in a live performance.

  • You can now use Game Controllers as input devices for MusiKraken on iOS and Android!

    Assign chords to each button, or generate notes, Control Change, pitch bend or channel pressure events with the thumbsticks or analogue buttons (see the sketch after this post).

    PlayStation controllers (PS4 and PS5) also have accelerometers, changeable lights (you can assign a light to each button) and touchpads that support up to two fingers.

    Restrictions: Game Controllers cannot be used while the app is in the background. And if you have an Audio Unit view open, and tap into it, the Game Controller events will no longer be sent to MusiKraken. Simply close the Audio Unit view to get the events again (and as long as you do not interact with the Audio Unit, you can keep it open).
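
    As an illustration of the analogue mapping described above, a hypothetical sketch (not MusiKraken's actual code) converting a thumbstick axis to the two data bytes of a 14-bit MIDI pitch-bend message:

    ```swift
    // Map a thumbstick axis in -1.0...1.0 to a 14-bit MIDI pitch-bend value
    // (0...16383, centre 8192), split into the message's LSB/MSB data bytes.
    func pitchBendBytes(axis: Float) -> (lsb: UInt8, msb: UInt8) {
        let clamped = max(-1.0, min(1.0, axis))
        let value = Int((((clamped + 1.0) / 2.0) * 16383.0).rounded())
        return (UInt8(value & 0x7F), UInt8((value >> 7) & 0x7F))
    }

    _ = pitchBendBytes(axis: 0)   // -> (lsb: 0, msb: 64), i.e. centre (8192)
    ```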

  • @Snarp How awesome is this new feature! I love how it can use the PS controller accelerometer to do drums, too. This app just jumped up high on my wish list.
