
Advice for designing an iOS DAW


Comments

  • I suggest "reverse engineering" an existing app and modifying it to your specifications.

    Just don't forget to totally redesign the GUI so no one will be able to guess which app you reconstructed. ;)

  • @AudioGus said:

    @SevenSystems said:
    @tadat that's the stuff that my nightmares are made of and that I desperately try to avoid in my interfaces. 😬

    Moving small objects like notes directly with your finger doesn't work very well because your finger covers the very object you're trying to move.

    One could make it so that the first part of the ‘move’ (a tiny number of transparent milliseconds) simply offsets the finger from the object to be moved. Then the move begins to take effect, with the finger now separate from the note. The brain locks onto the note as the thing being moved, and the finger is merely the controller. I did a mockup (with a programmer's help) of this in Unity once with mouse control, and I bet it translates even better to touch. Would love for someone (cough, hello :) ) to try it.

    Something like this perhaps...
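    The offset idea described above can be modelled platform-agnostically. A minimal Python sketch (the class name, the dead-zone length, and all parameter names are invented for illustration, not taken from any real app):

```python
class OffsetDrag:
    """Sketch of the 'offset drag' idea: during a short dead zone at
    the start of the drag, the finger slides away from the note while
    the note stays put; afterwards the note tracks the finger 1:1 at
    that offset, so the finger no longer covers it."""

    def __init__(self, note_pos, dead_zone_ms=80.0):
        self.note0 = note_pos          # note position at touch-down
        self.dead_zone_ms = dead_zone_ms
        self.t0 = None                 # touch-down timestamp (ms)
        self.anchor = None             # finger position ending the dead zone

    def touch_down(self, pos, t_ms):
        self.t0 = t_ms
        self.anchor = pos

    def touch_move(self, pos, t_ms):
        """Return the note's position for the current finger position."""
        if t_ms - self.t0 < self.dead_zone_ms:
            # Dead zone: the note doesn't move yet; the finger's travel
            # here becomes the permanent finger-to-note offset.
            self.anchor = pos
            return self.note0
        # After the dead zone: 1:1 tracking, offset by the dead-zone travel.
        return (self.note0[0] + pos[0] - self.anchor[0],
                self.note0[1] + pos[1] - self.anchor[1])
```

    A 10 px downward slide during the dead zone leaves the note 10 px above the finger for the rest of the drag.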

  • @AudioGus said:

    @AudioGus said:

    @SevenSystems said:
    @tadat that's the stuff that my nightmares are made of and that I desperately try to avoid in my interfaces. 😬

    Moving small objects like notes directly with your finger doesn't work very well because your finger covers the very object you're trying to move.

    One could make it so that the first part of the ‘move’ (a tiny number of transparent milliseconds) simply offsets the finger from the object to be moved. Then the move begins to take effect, with the finger now separate from the note. The brain locks onto the note as the thing being moved, and the finger is merely the controller. I did a mockup (with a programmer's help) of this in Unity once with mouse control, and I bet it translates even better to touch. Would love for someone (cough, hello :) ) to try it.

    Something like this perhaps...

    That's certainly interesting... cannot see any obvious problems with that right now, but I'm sick and my brain is operating at about 30% 😏

  • Yes, that's the proper way of moving hit objects - been around since the iElectribe on iPad One, which additionally used distance from 1st touch to scale parameters of dials.
    ps: to study gestures TC Data is probably the most accessible and complete reference.
    It's worth the price just for the learning experience ;)

  • edited August 2019

    @Telefunky said:
    Yes, that's the proper way of moving hit objects - been around since the iElectribe on iPad One, which additionally used distance from 1st touch to scale parameters of dials.
    ps: to study gestures TC Data is probably the most accessible and complete reference.
    It's worth the price just for the learning experience ;)

    Agreed, just about everything does that for dials, and I do have TC Data. I am talking about applying it to note movement, though. Do you have any examples of that specifically?

  • I mentioned TC Data for those who don't have it, because it has just about every combination of touch analysis and follow-up control action (number of hit points, initial time and location, movement of point(s) in distance and angle over time). It shows the detail of control iOS provides over gestures, neatly sorted in selection lists.

    TC Data is about the momentary generation of MIDI events, but one can do an imaginary transformation into different interface domains. At first glance the number of options is overwhelming and almost confusing.

    The movement of notes on a piano roll is a simple example, but it shows the complexity:

    Is it a single note or multiple notes?
    Are notes supposed to keep their harmonic context, their time context?
    Is movement free, or is there grid alignment?
    What happens if the destination area already contains notes?
    Will those be shifted or replaced, or do the moved notes overlap?
    (No two notes can have the exact same position in MIDI, AFAIK.)
    The moved selection of notes may be non-continuous, which adds complexity when handling the target area. (Just a quick shot of ideas, without deep meditation.)

    The drag action as shown in the video requires a lot of space and precise finger movement.
    But it could be scaled in a way that it works even on a tiny phone screen.
    As an alternative one could simply tap the note(s) and have a control strip area for the up/down movement. Another such strip can provide horizontal movement.
    With the strips always at the same location routine builds up quickly and there's only one point of attention.
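    The strip idea boils down to mapping drag distance on a fixed strip to discrete edit steps. A hedged Python sketch of one possible mapping (the dict shape, pixel values, and tick resolution are all invented for illustration):

```python
def apply_strips(note, dx_px, dy_px,
                 px_per_semitone=12, px_per_grid=24, grid_ticks=120):
    """Apply control-strip drags to a note (a dict with 'pitch' 0-127
    and 'start' in MIDI ticks). dy_px is the drag on the vertical
    strip (positive = transpose up in semitones); dx_px is the drag
    on the horizontal strip (positive = later in time, snapped to a
    grid of grid_ticks ticks)."""
    pitch = note['pitch'] + int(dy_px / px_per_semitone)
    start = note['start'] + int(dx_px / px_per_grid) * grid_ticks
    # Clamp to the valid MIDI pitch range and the song start.
    return {'pitch': max(0, min(127, pitch)), 'start': max(0, start)}
```

    Because the strips are fixed on screen, the same finger motion always means the same edit, which is what makes the routine build up quickly.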

  • edited August 2019

    Yes, yes, yes. All of this is already very complex if you want to realize it for a desktop with mouse and keyboard.

    A touchscreen device opens completely new areas of user interaction. The possibilities seem overwhelming at first, but will finally boil down to a very few, very well-thought-out paradigms: something shared by all apps.

    The first deep-impact experience that touchscreen usage brought was pinch & zoom in the browser. That was cool! You could actually zoom into the area of interest!! Revolutionary! Unfortunately, this cool thing has never made its way into most apps (apart from some graphics tools).

    But the current reality is something else entirely: static desktop versions are just ported nearly 1:1 onto the iOS platform, especially with many Audio Units. The popups and menus were simply translated. These popups, by the way, which lock the entire screen with all its controls, are another awfulness on iOS. I would forbid them if I ever could.

    But these minified static interfaces are not even made zoomable, so nobody can actually read the labels or hit the controls correctly.

    ... but there's more: iOS users seem to be "trained" by now not to even try to zoom in, because they have done so too often with zero results.
    I have seen users claiming that everything is too small and unusable, because they did not even try a pinch gesture, which would have solved their problem.

  • @Telefunky said:
    As an alternative one could simply tap the note(s) and have a control strip area for the up/down movement. Another such strip can provide horizontal movement.
    With the strips always at the same location routine builds up quickly and there's only one point of attention.

    Something like this? 😉

  • @tadat said:
    Wondering if anyone here might be able to provide some advice for designing my own iOS DAW. I’ve never designed an app, don’t know coding, etc... So obviously a lot to learn. I’m considering commissioning someone, too. Just know that I have very specific ideas I’m looking for in an iOS DAW, and have grown frustrated waiting for other iOS DAWs to incorporate what I need.

    I see iOS as the future of computer music, and am continually surprised how few options there are which fully bring together all of these forward-thinking developments... which to me is the whole point of all of this, incorporate them all into one app, so we spend less time thinking about how to make everything work together and more time making great new music.

    Aside from the obvious DAW staples, here’s a list of what this new DAW must incorporate:

    • MIDI and Audio
    • Side-chaining
    • MPE integration
    • Bluetooth MIDI
    • MIDI FX
    • Auv3
    • Integrated drum programmer
    • Integrated arpeggiator

    And at every level, it should be designed with a touch interface in mind (you should be able to touch anything you see and manipulate it right there, without having to dive into peripheral menus). And very much visually based (icons, color hierarchy, etc.), with text only where it's essential. It should be simple on the surface, but deep when you start diving in. Its fundamental functions should be absolutely intuitive, without having to read a manual/watch a video, etc.

    There are a lot of great programs out there, and I’ve spent lots of time with each. But for my personal needs, this is my dream of the perfect iOS DAW. Frustrated in waiting for it to arrive, I’ve decided to try to make it happen myself. So......... if anyone has any advice on first steps to take, I’d really appreciate it!!

    Thank you!!

    Ableton Start Repeat.

    Success.

  • @SevenSystems said:

    @Telefunky said:
    As an alternative one could simply tap the note(s) and have a control strip area for the up/down movement. Another such strip can provide horizontal movement.
    With the strips always at the same location routine builds up quickly and there's only one point of attention.

    Something like this? 😉

    yes... and I'd certainly have quoted your app, but since I'm tied to iOS 9 I never had the chance to dig into it. ;)

  • edited August 2019

    @Telefunky said:
    yes... and I'd certainly have quoted your app, but since I'm tied to iOS 9 I never had the chance to dig into it. ;)

    then try NanoStudio 1, which works on iOS 9 and has the same control elements... (un)officially they are called "drag handles" :-) It was probably the first app to use this concept on iOS... then BM2 (which should work on iOS 9 too, I think)... Xequence, NS2 (obviously)... wondering why more apps haven't adopted this; it should be the gold standard in any DAW on iOS :)

  • edited August 2019

    @SevenSystems said:

    @Telefunky said:
    As an alternative one could simply tap the note(s) and have a control strip area for the up/down movement. Another such strip can provide horizontal movement.
    With the strips always at the same location routine builds up quickly and there's only one point of attention.

    Something like this? 😉

    IMO, the above type of "drag handle" control is the fastest and most efficient way for editing piano-roll notes.

    When I compose, I use two-phase piano-roll melody writing. The first phase is real-time entry: playing just one or more MIDI controller keys to lay down notes on the piano roll that act as rhythm, note-length, and note-velocity placeholders.

    In phase two, I go back and can very quickly move each placeholder note up or down in pitch using the drag handles on the right side of the screen.

    I've tried this composing method with other piano-roll editors that use finger dragging, and dragging is just not feasible for this kind of piano-roll composing technique.

    I'm hopeful that one day a perfected system for two-phase piano-roll melody writing might be made available as part of a new DAW like the OP is discussing, as part of an existing app, or perhaps as a standalone piano-roll MIDI composing AUv3 built for two-phase writing (hopefully it would also include full-featured step-recording capabilities). :smile:

  • I also imagine one could do a 'joystick' that is all in the lower right corner so that up and down could be done with just the thumb. Works for Fortnite.

  • Yes, this kind of control is a fairly classic thing.
    My main point was that it's not always the most effective solution to operate directly on an object's location in touch handling.

    Twisted Wave has an ultra-smart zoom feature (which took me some time to fully figure out):
    dragging up and down exactly on the playhead line will zoom out/in with the playhead position fixed at its current location. The further your finger is left or right of the yellow line, the more the waveform (and playhead position) will move in the opposite direction during the zoom.
    It's a feature that will probably reveal its full potential only after a fair amount of getting used to this particular movement.
    Its display handling and screen refresh are the king of all iOS apps. :+1:

    There's a (small) grain of salt in the sensitivity of selections to erroneous hits: oops, selection gone. A dedicated 'selection lock' state might save the situation, and could further allow multiple selections as an additional bonus.
    Depending on context this can make things much easier to handle, but it takes design effort not to interfere with workflow. Some apps have this mode-lock feature.
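    The described zoom gesture can be modelled roughly. In the Python sketch below, the exact mapping and every constant are guesses (Twisted Wave's real implementation is unknown); only the two behaviours from the description are reproduced: the playhead stays pinned when the finger is on the line, and horizontal offset from the line adds increasing pan.

```python
def zoom_step(view_start, spp, playhead, finger_x, dy, pan_gain=0.02):
    """One increment of the zoom drag. view_start: first visible
    sample; spp: samples per pixel; playhead: playhead position in
    samples; finger_x: finger's screen x in px; dy: vertical drag
    delta in px (positive = zoom out)."""
    ph_x = (playhead - view_start) / spp     # playhead's screen x
    new_spp = spp * (1.01 ** dy)             # exponential zoom feels smooth
    # Finger exactly on the playhead line: keep the playhead at the
    # same screen x while the scale changes.
    new_start = playhead - ph_x * new_spp
    # Finger offset from the line: also pan, more the further the
    # finger is from the line.
    new_start += (finger_x - ph_x) * dy * pan_gain * new_spp
    return new_start, new_spp
```

    With the finger exactly on the line, the pan term vanishes and only the scale changes, which matches the pinned-playhead behaviour.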

  • @Telefunky said:
    There's a (small) grain of salt in the sensitivity of selections to erroneous hits: oops, selection gone. A dedicated 'selection lock' state might save the situation, and could further allow multiple selections as an additional bonus.

    That's why Xequence by default uses DOUBLE-tap in an empty area to unselect. It's hard to imagine you'd quickly miss the same note twice in a row 😉
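    As a sketch of that rule (the behaviour of taps on notes below is an assumption for the demo, not Xequence's actual selection logic):

```python
class DoubleTapUnselect:
    """A single tap in an empty area is forgiven (a missed note
    doesn't kill the selection); only two empty-area taps in quick
    succession clear it."""

    def __init__(self, max_interval_ms=300.0):
        self.max_interval_ms = max_interval_ms
        self.last_empty_tap_ms = None
        self.selection = set()

    def tap(self, hit_note, t_ms):
        if hit_note is not None:
            self.selection.add(hit_note)   # tapping a note selects it
            self.last_empty_tap_ms = None
            return
        if (self.last_empty_tap_ms is not None
                and t_ms - self.last_empty_tap_ms <= self.max_interval_ms):
            self.selection.clear()         # second quick empty-area tap
            self.last_empty_tap_ms = None
        else:
            # First empty tap: remember it, keep the selection.
            self.last_empty_tap_ms = t_ms
```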

  • But Twisted Wave is an audio editor with no 'free areas', unless you mean the title line... ;)

  • Don’t.

  • @SevenSystems said:

    @Telefunky said:
    As an alternative one could simply tap the note(s) and have a control strip area for the up/down movement. Another such strip can provide horizontal movement.
    With the strips always at the same location routine builds up quickly and there's only one point of attention.

    Something like this? 😉

    I think you've made the best MIDI editor for the touch screen. Nice amount of options that don't get in the way.

    I prefer drawing automation in NS2 but it's slower than X2 for note stuff.

  • @vov said:
    Don’t.

    Hey since you're here. Can you confirm that we are 2 different people?

    @hansjbs seems to think we are the same LOL.

  • @BroCoast said:

    @SevenSystems said:

    @Telefunky said:
    As an alternative one could simply tap the note(s) and have a control strip area for the up/down movement. Another such strip can provide horizontal movement.
    With the strips always at the same location routine builds up quickly and there's only one point of attention.

    Something like this? 😉

    I think you've made the best MIDI editor for the touch screen. Nice amount of options that don't get in the way.

    I prefer drawing automation in NS2 but it's slower than X2 for note stuff.

    Thanks... yes, editing automation in Xequence is very simplistic, i.e. you directly edit the actual MIDI data, with no additional vector layer like in most other editors. That has pros and cons, I guess.

  • @BroCoast said:

    @vov said:
    Don’t.

    Hey since you're here. Can you confirm that we are 2 different people?

    @hansjbs seems to think we are the same LOL.

    🤦🏾‍♂️🤦🏾‍♂️🤦🏾‍♂️🤷🏽‍♂️🤷🏽‍♂️🤷🏽‍♂️

  • @hansjbs said:

    @BroCoast said:

    @vov said:
    Don’t.

    Hey since you're here. Can you confirm that we are 2 different people?

    @hansjbs seems to think we are the same LOL.

    🤦🏾‍♂️🤦🏾‍♂️🤦🏾‍♂️🤷🏽‍♂️🤷🏽‍♂️🤷🏽‍♂️

    You could just say, "sorry I was mistaken."

    @winconway said:
    For the record, there was a user at the Beatmaker forum; they were called BroCoast. They made silly claims about corrupt save files after clearly stating that they were force-closing the app during saving, and that user then asked me to rename them to vov.
    If there is any confusion, it will be because of that; the reason the name stuck in people's minds is that that particular user went on a fairly entertaining rant.

  • I’ve been building an iOS-first (but cross-platform) DAW for the last year as a hobby project. I’ve recently started talking to zPlane about licensing Elastique for time stretching, as the sampler is a big focus. (I tried to model it after Simpler as much as possible, and yes, I plan to make it an AUv3 standalone, separate from the DAW.)

    However I’m at a crossroads. The project is a labor of love and has been a tremendous amount of work (but it works, mostly!). I’m not sure if it’s worthwhile to try to release it publicly and build along with the community here, or keep it to myself as a fun side project.

    Thoughts? Would folks want to hear more/help steer the development and roadmap if I made it public?

    You can also reach out to me directly if you’re interested in collaborating or chatting in more depth.

  • @Eclipxe said:
    I’ve been building an iOS first (but cross platform) DAW for the last year as a hobby project. I’ve recently started talking to zPlane to license Elastique for time stretching, as the Sampler is a big focus. (I tried to model it after Simpler as much as possible, and yes I plan to make it an AuV3 stand-alone from the DAW).

    However I’m at a crossroads. The project is a labor of love and has been a tremendous amount of work (but it works, mostly!). I’m not sure if it’s worthwhile to try to release it publicly and build along with the community here, or keep it to myself as a fun side project.

    Thoughts? Would folks want to hear more/help steer the development and roadmap if I made it public?

    You can also reach out to me directly if you’re interested in collaborating or chatting in more depth.

    Just my personal opinion, but I’d say put 100% into finishing the auv3 sampler first.

    There’s a much bigger hole in the App Store for a Simpler-sampler than another DAW.

  • @Eclipxe said:
    I’ve been building an iOS first (but cross platform) DAW for the last year as a hobby project. I’ve recently started talking to zPlane to license Elastique for time stretching, as the Sampler is a big focus. (I tried to model it after Simpler as much as possible, and yes I plan to make it an AuV3 stand-alone from the DAW).

    However I’m at a crossroads. The project is a labor of love and has been a tremendous amount of work (but it works, mostly!). I’m not sure if it’s worthwhile to try to release it publicly and build along with the community here, or keep it to myself as a fun side project.

    Thoughts? Would folks want to hear more/help steer the development and roadmap if I made it public?

    You can also reach out to me directly if you’re interested in collaborating or chatting in more depth.

    Haha, I am in the same boat: a month ago I started my first iOS audio app. For now it's only a MIDI plugin, but of course I also dream big, and my "ultimate goal" is a separate DAW. AUv3 is awesome, but very limited if you want to control the whole creative workflow. I also came to the same conclusion that for a proper sampler, AUv3 is a no-go (mostly because of the memory limits).

    Of course, an AUv3 sampler with Elastique in it would be dope, but I would also prefer deep Audiobus integration for a standalone app, where you can have multiple channels / sampler units and multichannel output via AB, plus proper AB state saving. Currently there is no time-stretching app into which you can just load loops, save with an AB project, and recall it anytime later. Not to mention multiple outputs...

    Definitely release your app via TestFlight as soon as possible; I bet you'll get a lot of feedback, and you'll still spend a lot of time incorporating it before a proper release. But be very strict about feature requests and avoid feature creep, unless you're 100% sure your app would make no sense without a given feature. You can always work on a version 2 in parallel that will contain the new features.
