Advice for designing an iOS DAW

Wondering if anyone here might be able to provide some advice for designing my own iOS DAW. I’ve never designed an app, don’t know coding, etc... so obviously there’s a lot to learn. I’m considering commissioning someone, too. Just know that I have very specific ideas about what I’m looking for in an iOS DAW, and I’ve grown frustrated waiting for other iOS DAWs to incorporate what I need.

I see iOS as the future of computer music, and I’m continually surprised by how few options fully bring together all of these forward-thinking developments... which to me is the whole point of all of this: incorporate them all into one app, so we spend less time thinking about how to make everything work together and more time making great new music.

Aside from the obvious DAW staples, here’s a list of what this new DAW must incorporate:

• MIDI and Audio
• Side-chaining
• MPE integration
• Bluetooth MIDI
• MIDI FX
• AUv3
• Integrated drum programmer
• Integrated arpeggiator

And at every level, it should be designed with a touch interface in mind (you should be able to touch anything you see and manipulate it right there, without having to dive into peripheral menus). It should be very much visually based (icons, color hierarchy, etc.), with text only where it’s essential. It should be simple on the surface, but deep when you start diving in. Its fundamental functions should be absolutely intuitive, without having to read a manual, watch a video, etc.

There are a lot of great programs out there, and I’ve spent lots of time with each. But for my personal needs, this is my dream of the perfect iOS DAW. Frustrated with waiting for it to arrive, I’ve decided to try to make it happen myself. So......... if anyone has any advice on first steps to take, I’d really appreciate it!!

Thank you!!


Comments

  • Take a close look at Ableton Simpler (The built-in Sampler) and 'create' an AUv3 version with similar features.
    I'm pretty sure it would sell like hotcakes since 'everyone' is still looking for a flexible easy-to-use AUv3 sampler :)

  • edited August 2019

    @tadat said:
    I’ve never designed an app, don’t know coding, etc...

    Seems like mission impossible :-)

    One of the most important features of a DAW is stability and rock-solid timing. Just use iOS for simple stuff and sketches; do the real work on a laptop...

  • I’ll take a look, Samu... thanks. Although, I’m honestly less interested in incorporating individual instruments, fx, sounds, etc.... The idea here is more to create a house where all of these modular pieces can come together and work together. There is so much innovation going on with individual app instruments and fx, and that is what is so exciting about composing on iOS. What seems to be missing, however, is one app designed to seamlessly, and simply, host all of these modular pieces.

  • @tadat said:
    I’ll take a look, Samu... thanks. Although, I’m honestly less interested in incorporating individual instruments, fx, sounds, etc.... The idea here is more to create a house where all of these modular pieces can come together and work together. There is so much innovation going on with individual app instruments and fx, and that is what is so exciting about composing on iOS. What seems to be missing, however, is one app designed to seamlessly, and simply, host all of these modular pieces.

    Understood, but think about it. What would one want to do with recorded audio in a DAW?
    Transpose it? Stretch it? Filter it? Modify clip gain during playback for fade-ins/outs?
    (Those are just some of the features in Ableton Simpler, and they can all also be applied to audio clips on the timeline.)

    Look at GarageBand, for example: it's possible to drag almost any session audio to its sampler and treat it as an instrument :)

    The most basic instrument in a modern DAW is the sampler.

  • iOS actually includes an audio unit sampler.
    It is basically the same code base as the macOS sampler.

    The strange thing is that Apple never made it particularly useful, nor is there any way to derive new audio units from it. A user interface for accessing the sampler on iOS is completely missing. The sampler can be instantiated in an app, but not repackaged as a new audio unit.

    So the deal is that someone must first code a completely new sampler audio unit.

    We created some code, but the project is frozen because of fundamental problems with sharing audio files for the sampler between instances and hosts.
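
    For reference, here's a minimal Swift sketch of instantiating that built-in sampler inside an app (AVAudioUnitSampler wraps Apple's AUSampler; the sample path and note values are made-up placeholders):

    ```swift
    import AVFoundation

    // The built-in sampler can be instantiated and wired into an AVAudioEngine,
    // but it ships with no user interface of its own on iOS.
    let engine = AVAudioEngine()
    let sampler = AVAudioUnitSampler()

    engine.attach(sampler)
    engine.connect(sampler, to: engine.mainMixerNode, format: nil)

    do {
        // Hypothetical sample path, purely for illustration.
        let sampleURL = URL(fileURLWithPath: "/path/to/kick.wav")
        try sampler.loadAudioFiles(at: [sampleURL])
        try engine.start()
        // Trigger it via MIDI; any editing UI has to be built by the app itself.
        sampler.startNote(60, withVelocity: 100, onChannel: 0)
    } catch {
        print("Sampler setup failed: \(error)")
    }
    ```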

  • edited August 2019

    And by the way: creating an entire DAW for iOS from scratch is roughly a year of full-time work for one developer, and better done by a team of developers... Those developers must already be experienced experts in audio programming and have a solid grounding in Apple's programming languages (several are required!).

    I think there is definitely no chance for a non-coder to ever achieve this.

    It is not so much that developers are too lazy to develop useful DAWs. It is an expert job for absolutely first-class developers, and the prospective compensation will probably never even attract such a person to consider doing it.

  • @JG_digister_com said:
    iOS actually includes an audio unit sampler.
    It is basically the same code base as the macOS sampler.

    True, but it doesn't have any UI on iOS, so I guess that's one reason many of the built-in AUv3s are hidden.
    For example, the Dynamics Processor is pretty sweet (it can be used in GarageBand).

    So the deal is that someone must first code a completely new sampler audio unit.

    That is what some developers have been doing, with AudioLayer, ELSA and apeMatrix Sampler AUv3 to name but a few.

    We created some code, but the project is frozen because of fundamental problems with sharing audio files for the sampler between instances and hosts.

    Sandboxing creates problems for sure. But as long as the apps support Files.app/DocumentPicker or support Drag'n'Drop, it's pretty easy to get to the samples, and this 'procedure' is somewhat accepted among iOS users.

    Once the samples are in the AUv3's storage container, they can be accessed from all instances regardless of the host.

    Allowing direct Drag'n'Drop from a host to an AUv3 is something Apple has to fix...
    (Dragging from Files.app to an AUv3 window already works, so it can be done.)

    So yeah, it's not an easy task, but as long as the AUv3 can access files outside of its own space using Files.app etc., it's a workable solution.
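
    A rough sketch of that import-then-share pattern (the App Group identifier, folder layout and class name below are assumptions for illustration, not anything a particular app actually uses):

    ```swift
    import UIKit
    import UniformTypeIdentifiers

    // Import a sample via the document picker, then copy it into an App Group
    // container that the host app and every AUv3 instance can read.
    final class SampleImportController: UIViewController, UIDocumentPickerDelegate {

        private let groupID = "group.com.example.mysampler"   // hypothetical App Group

        func pickSample() {
            let picker = UIDocumentPickerViewController(forOpeningContentTypes: [.audio], asCopy: true)
            picker.delegate = self
            present(picker, animated: true)
        }

        func documentPicker(_ controller: UIDocumentPickerViewController,
                            didPickDocumentsAt urls: [URL]) {
            guard let source = urls.first,
                  let container = FileManager.default
                      .containerURL(forSecurityApplicationGroupIdentifier: groupID) else { return }

            let samplesDir = container.appendingPathComponent("Samples", isDirectory: true)
            let destination = samplesDir.appendingPathComponent(source.lastPathComponent)

            do {
                try FileManager.default.createDirectory(at: samplesDir, withIntermediateDirectories: true)
                try FileManager.default.copyItem(at: source, to: destination)
                // Every instance can now load the file from the shared container.
            } catch {
                print("Import failed: \(error)")
            }
        }
    }
    ```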

  • I feel like there is a reason the "perfect" iOS DAW doesn't exist yet, as there are many talented devs creating for iOS.
    I don't know if they intentionally leave out certain features, or if it's a problem with how Apple works, but there are some very close apps that, if we could somehow magically combine them, would get us there. Almost every DAW for iOS is just a couple of features short, but you'll find those missing features in another DAW that is in turn missing features the first one has... haha, it's kinda strange and I wonder why.

    The best imo are Stagelight, NS2 and Cubasis.
    I lean toward Stagelight for two reasons:
    clip sequencing and audio tracks.
    What it's missing is the ability to use MIDI effects. I want to use these amazing iOS sequencers to create clips that I can launch freely in any order... So for now I use Stagelight mostly with imported audio that I've recorded elsewhere on iOS.

    If you actually are going to figure out how to build a DAW, make sure to have audio tracks, the ability to use MIDI effects, and clip sequencing.

  • The audio unit sampler is heavily used in GarageBand, by the way.
    But yes, only Apple has the ability to access the code that way...

    The audio unit sampler is interesting because it shares its aupreset format with macOS.

    So an IAA app with the audio unit sampler should be easily possible, and I am wondering why nobody has done this yet.
    The aupreset format is plain XML, easy to read and easy to understand. It is even possible to compile new presets on the fly based on some actions in a user interface...

    That said, this sampler is very picky about where the samples have to reside, is prone to XML syntax errors, and has some internal configuration issues. It is far from perfect, but at least it has built-in realtime disk streaming.

    Disk streaming is a requirement for any iOS sampler, as memory is quite limited on mobile devices.
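
    Since an .aupreset is just an XML property list, it can be read (or generated on the fly) with standard APIs and handed straight to the sampler. A minimal sketch, with a made-up preset path:

    ```swift
    import AVFoundation

    // Hypothetical preset path, purely for illustration.
    let presetURL = URL(fileURLWithPath: "/path/to/MyInstrument.aupreset")

    // Read the preset as a plain plist dictionary to inspect or tweak its contents.
    if let data = try? Data(contentsOf: presetURL),
       let plist = try? PropertyListSerialization.propertyList(from: data, options: [], format: nil),
       let preset = plist as? [String: Any] {
        print("Preset keys:", preset.keys.sorted())
    }

    // Hand the same preset to the AUSampler wrapper.
    let sampler = AVAudioUnitSampler()
    do {
        try sampler.loadInstrument(at: presetURL)
    } catch {
        // Typically a missing sample path or malformed XML, as noted above.
        print("Failed to load preset: \(error)")
    }
    ```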

  • edited August 2019

    A DAW is too big a task, especially if you have no experience. It means many years of full-time work...

    I'd suggest you start with something simpler, like an AU plugin... to learn the basics, understand DSP coding, and get an overview of the caveats waiting for you in iOS audio app coding.

    A full-featured sampler is something that is missing on iOS, and many people would appreciate having a sampler like the one in BM3 as a standalone AU plugin. Even that is a job of a year at least, probably more.

    A whole DAW is a monstrous task even for an experienced coder; simply forget it for now.

    In general, DSP coding is probably the most complicated kind of code to write; it's even a lot more complicated than machine learning / AI, which is sometimes pretty mind-bending, but still nothing compared to DSP coding. Be ready for C/C++ and even ARM assembler if you want to write really efficient code (definitely forget Swift).
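
    To give a flavour of what "DSP coding" means in practice, here is the shape of a typical real-time-safe inner loop: a per-sample gain ramp. It is sketched in Swift purely for readability (production render code is usually C/C++, as noted above), and the function and parameter names are made up:

    ```swift
    // Illustrative only: ramp the gain across one audio buffer.
    // Real-time rules apply regardless of language: no allocation, no locks,
    // no file or network I/O inside the render loop.
    func applyGainRamp(buffer: UnsafeMutablePointer<Float>,
                       frameCount: Int,
                       startGain: Float,
                       endGain: Float) {
        guard frameCount > 0 else { return }
        let step = (endGain - startGain) / Float(frameCount)
        var gain = startGain
        for frame in 0..<frameCount {
            buffer[frame] *= gain     // scale each sample
            gain += step              // ramp smoothly to avoid zipper noise
        }
    }
    ```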

  • Sorry to be discouraging, but this sounds like an insane proposition.

    I look at BM3 (third try?) and NS2 (second try?) and see that even these code wizards have difficulty with this task of creating the greatest iOS DAW, even with years of experience.

    But, rest assured, I too know little about it, so maybe you can do it.
    That would be an incredible accomplishment.
    And I would get the perfect iOS DAW, finally!
    B)

  • I have fun drawing imaginary UIs just to get them out of my system.

  • @tadat said:
    .. It should be simple on the surface, but deep when you start diving in. Its fundamental functions should be absolutely intuitive, without having to read a manual, watch a video, etc.

    There are a lot of great programs out there, and I’ve spent lots of time with each. But for my personal needs, this is my dream of the perfect iOS DAW. Frustrated with waiting for it to arrive, I’ve decided to try to make it happen myself. So......... if anyone has any advice on first steps to take, I’d really appreciate it!!

    How about considering yourself not as a static person, but as one with growing skills?
    Things may not reveal themselves at first glance, but if they are designed in a clear and structured way, they will in less time than expected. Learn to learn. ;)

    I admit it's not exactly easy, as strategy and structure are the elements most missing in iOS DAW design, and in many cooperating apps, too.
    (As mentioned above: if you could melt half a dozen apps into a single one, most problems would be solved.)

    My main DAW on desktop was far from unintuitive in the beginning, but now I'm one of those who operate the thing like greased lightning (as someone once called some Nashville sound engineers doing their job on it).
    When I opened Pro Tools TDM the first time, I thought: omg, that's the industry standard??? It took less than a month to get acquainted with it and recognize the cool parts.

    If you design something that's immediately obvious to any beginner, the parts making it easy will turn into obstacles after a rather short time.
    I remember typesetters (in the early days of desktop publishing) who answered sequences of three dialog boxes in Quark XPress with keystrokes so fast that I couldn't even read the dialogs' content. Just a sequence of popping up and vanishing frames - a routine job.

    It takes effort to sort out the apps that provide the functionality you need and fit your style of work; just don't give up too early.

  • @Telefunky said:
    My main DAW on desktop was far from unintuitive in the beginning, but now I'm one of those who operate the thing like greased lightning...

    sorry, should read far from intuitive

  • One of the most experienced and highly respected iOS audio developers has spent 5 years developing a DAW that doesn't even have half of the stuff you mention, and that guy already knew absolutely everything necessary and already had lots of bits and pieces floating around because he had already designed ANOTHER DAW-like "thing" before that one 😎 just trying to give some perspective here...

    (some personal perspective: it took me (25 years of software development experience) half a year to develop an average MIDI sequencer alone (arranger, pianoroll / controller editor, keyboard with scales support). That doesn't include MPE, MIDI FX or anything like that, and I don't even think that was particularly slow).

  • The problem with this is that when you focus on iOS only, you lock yourself into Apple, their "platform" mentality, and whatever whim they have, which might be to outright copy your software, release it as a free update to their OS, and leave you in financial free-fall if you didn't have enough cash for your investment (as most people don't) and took out a loan to make ends meet. I am not that person, but it has happened to many in Apple's orbit.

    If you want to do this, try to make something that works on open "platforms", while designing it to be functional on iOS as well, so you at least have an escape hatch.

    I do not seriously expect anyone to make the high investment necessary to build such software in the face of Apple's platform control and their gigantic cut of the pie compared to the risks they bring. I don't see this happening until financial pressure forces Apple to open at least its iPadOS "platform".

  • @SevenSystems said:
    One of the most experienced and highly respected iOS audio developers has spent 5 years developing a DAW that doesn't even have half of the stuff you mention, and that guy already knew absolutely everything necessary and already had lots of bits and pieces floating around because he had already designed ANOTHER DAW-like "thing" before that one 😎 just trying to give some perspective here...

    (some personal perspective: it took me (25 years of software development experience) half a year to develop an average MIDI sequencer alone (arranger, pianoroll / controller editor, keyboard with scales support). That doesn't include MPE, MIDI FX or anything like that, and I don't even think that was particularly slow).

    Aren’t you also working on a DAW?
    If so, it is bound to be amazing.

  • The other thing is that the user interface paradigms of a desktop computer cannot simply be ported to iOS, as happens so often; see Cubasis, for instance. ^^ It just does not work optimally, and the result is an awful user experience, nothing else.

    UI/UX design is a damn complicated task on mobile devices. Usually you design ten attempts and (maybe) one actually works.

    Also, users are quite different. The ideal DAW for one person is not automatically the ideal DAW for another.

    My concern with many DAWs and Audio Units on iOS is the static mini controls and images that are so damn small, and nothing can make them bigger, even though pinch & zoom would be quite easy to implement. It's just the old mouse-editing paradigm, which does not work on mobile devices.

    I've not seen a single MIDI sequencer on iOS yet that really tries to adopt a new and truly intuitive touch-screen behavior. Everything traces back to classic grid-based editing, and the grids are fixed in size. The Cubasis zoom feature in all its editors is just awful, for instance, and nobody actually seems to even notice.

  • @CracklePot said:

    @SevenSystems said:
    One of the most experienced and highly respected iOS audio developers has spent 5 years developing a DAW that doesn't even have half of the stuff you mention, and that guy already knew absolutely everything necessary and already had lots of bits and pieces floating around because he had already designed ANOTHER DAW-like "thing" before that one 😎 just trying to give some perspective here...

    (some personal perspective: it took me (25 years of software development experience) half a year to develop an average MIDI sequencer alone (arranger, pianoroll / controller editor, keyboard with scales support). That doesn't include MPE, MIDI FX or anything like that, and I don't even think that was particularly slow).

    Aren’t you also working on a DAW?
    If so, it is bound to be amazing.

    Yes, it is already amazing of course 😉. But realistically, it will take much more time to get it into a releasable state. But it's huge fun already to produce in 🙂

  • @tadat As far as what you want, Apple has the best chance of actually accomplishing it. Admittedly, I haven't been using the iPad for music making for very long, but the closest I'm aware of any one person coming to achieving their vision within the "established" and "traditional" touch-based iOS conventions is Matt Borstel with NanoStudio 2. For further inspiration, someone else whose DAW is available on iOS, and who may be closer to achieving his vision, albeit not within the iOS design you seem to have in mind, is Alexander Zolotov, who created SunVox. I think SunVox was ported to iOS. It is touch based, but uses a scheme of its own. Maybe your vision is some combination of these two approaches. There are many experimental music apps that may hint toward what you want in a DAW.

    Probably the best place to start, before learning the programming end of it or hiring someone and hoping they can produce what you want, is to do the design yourself. Get whatever you have in mind down in some format and make sure it will logically work as the front end for whatever technical magic needs to happen on the back end to intuitively produce music. The iOS music community, developer and end-user alike, seems to be very open to new ideas. Try to network with others and pitch your design. You might be able to build a team to create your dream.

    Be pragmatic about it, though. There are certain realities of the software business that you will need to anticipate: contracts, copyrights, licensing (as required), submitting your DAW to Apple's App Store, etc. Find experienced professionals to help with all these things so you can focus on the design. You will need a design first, because without it, all you will have is a marketing description of what you want.

  • @JG_digister_com said:
    The other thing is that the user interface paradigms of a desktop computer cannot simply be ported to iOS, as happens so often; see Cubasis, for instance. ^^ It just does not work optimally, and the result is an awful user experience, nothing else.

    My concern with many DAWs and Audio Units on iOS is the static mini controls and images that are so damn small, and nothing can make them bigger, even though pinch & zoom would be quite easy to implement. It's just the old mouse-editing paradigm, which does not work on mobile devices.

    I've not seen a single MIDI sequencer on iOS yet that really tries to adopt a new and truly intuitive touch-screen behavior. Everything traces back to classic grid-based editing, and the grids are fixed in size. The Cubasis zoom feature in all its editors is just awful, for instance, and nobody actually seems to even notice.

    Yes, this exactly! The mouse editing paradigm should be abandoned altogether, and the interface should be re-evaluated through the lens of what this format allows that a desktop computer does not: a tangible relationship to the images... which is especially vital in music. MPE is mind-blowing in what it has achieved in allowing deep, dynamic expression from an otherwise static physical interface. Something similar can be done with the visual interface of a DAW. For example: in a piano roll view, you should never have to look away from a note to modify it. You just touch the note, and most modifications can be achieved by your movements while holding it: drag up/down/left/right to move the note. Touch the note with two fingers and drag up to increase velocity, down to decrease velocity, or right/left to extend the note’s length. With two fingers, tap and hold the note and WARP into the next sub-layer of relative information that the visual note represents.

    WARPING into and out of screens is the visual philosophy of this DAW (like moving into and out of a wormhole). Whereas the traditional mouse-based approach is to click on an image to bring up menus, the idea here is to instead TOUCH whatever you want to adjust, and depending on how many fingers or for how long, you’ll be able to manipulate whatever it is that image represents... a more tangible experience. And rather than the lateral or “stacked” approach of traditional DAWs and their various screens, this will take you INTO and OUT OF various screens/modes. E.g. in arranger mode, a two-finger long touch on a MIDI container will WARP you into its relative piano roll/drum sequencer. Yes, this is already somewhat common, but here, you continue that same idea: two-finger long touch a note and that will warp you into the next deeper level. Another example... The Mixer page should be extremely minimal, showing on that page only what is essential (what an actual human mixing the sound would be using while live mixing): volume sliders, audio levels, pan pots, FX send knobs, mute, etc. But if you two-finger long-touch a volume slider, that would warp you into a new page where you can manipulate all parameters related to that track’s volume and output: graph the track’s volume/automation, add Audio FX, EQ the audio, etc. Then, in the audio’s visual graph, if there’s a point you want to adjust, tap it using the same tapping functions established elsewhere... two-finger long tap it to WARP into a deeper layer of variables relating to that specific point. Then when you want to step back out to the previous screen you just double tap with two fingers. And if you keep double-tapping with two fingers, anywhere on the screen, no matter where you are, you’ll eventually end up back at the primary screen/layer, which is the arranger.

    The idea is to minimize each screen so that the musician only sees what they might use at that stage of the process. Then, if they want to go deeper into a particular musical phrase/loop, they just, without looking away from the object of interest, two-finger long hold on that object to go deeper into its sound elements. WARPING into and out of the song, always visually focused on the specific object of interest. (A rough code sketch of these gestures follows at the end of this post.)

    This is the possibility of how a touch interface can improve the traditionally static relationship between the musician and device/instrument, by focusing on a more tangible relationship between the visual information and the process of music creation.
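
    As a rough illustration only, the two-finger "warp" gestures described above could be prototyped with standard UIKit gesture recognizers. The NoteView class and the handler behaviour are hypothetical; only the gesture APIs themselves are real:

    ```swift
    import UIKit

    final class NoteView: UIView {

        override init(frame: CGRect) {
            super.init(frame: frame)

            // Two-finger long press = warp one level deeper into this note.
            let warpIn = UILongPressGestureRecognizer(target: self, action: #selector(handleWarpIn(_:)))
            warpIn.numberOfTouchesRequired = 2
            addGestureRecognizer(warpIn)

            // Two-finger double tap = warp back out toward the arranger.
            let warpOut = UITapGestureRecognizer(target: self, action: #selector(handleWarpOut(_:)))
            warpOut.numberOfTouchesRequired = 2
            warpOut.numberOfTapsRequired = 2
            addGestureRecognizer(warpOut)
        }

        required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

        @objc private func handleWarpIn(_ gesture: UILongPressGestureRecognizer) {
            guard gesture.state == .began else { return }
            // Push the next-deeper editing layer for this note (hypothetical).
            print("Warp into note detail layer")
        }

        @objc private func handleWarpOut(_ gesture: UITapGestureRecognizer) {
            // Pop back one layer toward the arranger (hypothetical).
            print("Warp out one layer")
        }
    }
    ```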

  • @tadat : these are great ideas. I agree that too many apps haven’t embraced the new paradigms that a touch interface makes available. Part of the problem is that starting from scratch is so slow-going that, for large-scale projects, devs largely cut corners by starting from something familiar so that they can jump right in.

    Bear in mind that implementing software like this takes a lot of time. It will be a project of years for a knowledgeable programmer. This isn’t meant to discourage you. An app such as you describe would be brilliant. Sometimes people don’t realize the scale of projects like this. The real-time aspect of audio processing (particularly with the restrictions of mobile devices) requires special skills above and beyond just being able to code... and the pool of people good at it is small, which makes it tricky to find an affordable programmer-for-hire.

  • edited August 2019

    @tadat that's the stuff that my nightmares are made of and that I desperately try to avoid in my interfaces. 😬

    Moving small objects like notes directly with your finger doesn't work very well because your finger covers the very object you're trying to move.

    And I know all these "touch and hold with 3 fingers while making a circular up and down motion with your other two fingers" gestures are totally en vogue nowadays (iOS adopts them more and more too, unfortunately), but they're a huge discoverability problem, and also a problem for memorization. Nothing about the object you see implies that you can touch and drag it circularly with 6 fingers.

    For me, every single feature should be represented by a separate user interface element, i.e. button, slider, handle, etc. The most-used stuff goes in the first hierarchy level, the lesser-used in the first submenu level, and so on... i.e. as developed at Xerox roughly 250 years ago 😁

    (yes I'm old!).

    (and of course this doesn't apply to absolutely universal gestures like zooming or scrolling. But that's roughly where "universality" ends for me.)

  • Moving small objects like notes directly with your finger doesn't work because your finger covers the very object you're trying to move.

    Of course, this is incredibly frustrating when I see this in app design. However, for the language of touch interface to work properly with the human brain, there needs to be a visual response to that object to signify we’ve made proper contact. As such, whatever it is you’re touching will have a visual response, and objects likely to be obscured by the finger would enlarge appropriately. The whole point is simplicity and clarity. If the workflow impedes that primary function, find a way to visually adapt. :)

  • I think the idea of having a DAW that is designed from the ground up to be used for a visual touch screen environment is a good one. Developing consensus around what that might be seems challenging. Hiring someone to implement these ideas seems limited to someone with extremely deep pockets. As others have pointed out, very few developers would be able to implement such a design as an app.

    Perhaps coming up with good mockups might be your most effective strategy for helping your ideas to get some traction. If there’s enough enthusiasm for them, there could also be a path created for acquiring the resources needed to make them a reality.

  • @tadat said:

    Moving small objects like notes directly with your finger doesn't work because your finger covers the very object you're trying to move.

    Of course, this is incredibly frustrating when I see this in app design. However, for the language of touch interface to work properly with the human brain, there needs to be a visual response to that object to signify we’ve made proper contact. As such, whatever it is you’re touching will have a visual response, and objects likely to be obscured by the finger would enlarge appropriately. The whole point is simplicity and clarity. If the workflow impedes that primary function, find a way to visually adapt. :)

    Excellent point. You see the video games industry come up with excellent, simple visual and audio designs to make touch interfaces work! Simple bleeps and enlarged graphics on touch can help. I encourage your thoughts, and like others have said, you will probably need to spend a lot of time learning coding before you could even start such a project. However, don't let this put you off; even developers might not have the right approach to some of the issues an iOS DAW presents.
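
    A tiny sketch of that "visual response on contact" idea in plain UIKit (the view name and the 1.5x scale are arbitrary choices for illustration):

    ```swift
    import UIKit

    final class TouchFeedbackNoteView: UIView {

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesBegan(touches, with: event)
            UIView.animate(withDuration: 0.1) {
                // Enlarge on contact so the object stays readable under the finger.
                self.transform = CGAffineTransform(scaleX: 1.5, y: 1.5)
            }
        }

        override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
            super.touchesEnded(touches, with: event)
            UIView.animate(withDuration: 0.1) {
                self.transform = .identity   // settle back when released
            }
        }
    }
    ```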

  • edited August 2019

    @SevenSystems said:
    @tadat that's the stuff that my nightmares are made of and that I desperately try to avoid in my interfaces. 😬

    Moving small objects like notes directly with your finger doesn't work very well because your finger covers the very object you're trying to move.

    One could make it so that the first part of the ‘move’ (a tiny number of transparent milliseconds) simply offsets the finger from the object to be moved. Then moving begins to take effect with the finger now separate from the note. The brain locks on to the note as the thing being moved, and the finger is merely the controller. I did a mockup (with a programmer’s help) of this in Unity once with mouse control, and I bet it translates even better to touch. Would love for someone (cough, hello :) ) to try it.
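
    A rough UIKit sketch of that offset-drag idea (the view name and the 44 pt lift are assumptions, and the separation is applied instantly here rather than eased in over the first few milliseconds as suggested above):

    ```swift
    import UIKit

    final class DraggableNoteView: UIView {

        private var dragStartCenter: CGPoint = .zero
        private var dragStartTouch: CGPoint = .zero
        private let liftOffset = CGPoint(x: 0, y: -44)   // separation applied at drag start

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            guard let touch = touches.first, let canvas = superview else { return }
            dragStartCenter = center
            dragStartTouch = touch.location(in: canvas)
        }

        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            guard let touch = touches.first, let canvas = superview else { return }
            let finger = touch.location(in: canvas)
            // The note follows the finger's movement, plus a fixed lift so the
            // finger never covers the thing it is moving.
            center = CGPoint(x: dragStartCenter.x + (finger.x - dragStartTouch.x) + liftOffset.x,
                             y: dragStartCenter.y + (finger.y - dragStartTouch.y) + liftOffset.y)
        }
    }
    ```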

  • I think simple drag handles, re-invented for iOS by Mr. Borstel in NS1, are simply the best possible way to move things on a touchscreen...

  • @dendy said:
    I think simple drag handles, re-invented for iOS by Mr. Borstel in NS1, are simply the best possible way to move things on a touchscreen...

    Currently. But I am sure new untapped solutions are waiting to be, uhhh, tapped.

  • True, you cannot know that something is better until it has actually been invented :-)

    Btw, regarding knobs: by far the best UI for knobs is implemented in Thor.
