Audiobus: Use your music apps together.
What is Audiobus? — Audiobus is an award-winning music app for iPhone and iPad which lets you use your other music apps together. Chain effects on your favourite synth, run the output of apps or Audio Units into an app like GarageBand or Loopy, or select a different audio interface output for each app. Route MIDI between apps — drive a synth from a MIDI sequencer, or add an arpeggiator to your MIDI keyboard — or sync with your external MIDI gear. And control your entire setup from a MIDI controller.
Audiobus is the app that makes the rest of your setup better.
Comments
I suggest "reverse engineering" an existing app and modifying it to your specifications.
Just don't forget to totally redesign the GUI so no one will be able to guess which app you reconstructed.
Something like this perhaps...
That's certainly interesting... cannot see any obvious problems with that right now, but I'm sick and my brain is operating at about 30% 😏
Yes, that's the proper way of moving hit objects; it's been around since iElectribe on the first iPad, which additionally used the distance from the first touch to scale the parameters of dials.
PS: to study gestures, TC Data is probably the most accessible and complete reference.
It's worth the price just for the learning experience
Agreed, just about everything does that for dials, and I do have TC Data. I am talking about applying it to note movement, though. Do you have any examples of that specifically?
I mentioned TC Data for those who don't have it because it offers just about any combination of touch analysis and follow-up control action.
(number of hit points, initial time and location, movement of point(s) in distance and angle over time)
It shows the detail of control IOS provides over gestures, neatly sorted in selection lists.
TC Data is about the momentary generation of MIDI events, but one can do an imaginary transformation into different interface domains. At first glance the number of options is overwhelming and almost confusing.
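The touch analysis described above (initial hit point, then movement of the point in distance and angle over time) boils down to simple geometry. A minimal, platform-neutral sketch in Python; the tuple-based point representation is just an illustration, not any app's actual API:

```python
import math

def touch_metrics(start, current):
    """Distance and angle of a touch relative to its initial hit point.

    `start` and `current` are (x, y) screen points. This models the kind
    of per-touch analysis an app like TC Data can map to control actions.
    """
    dx = current[0] - start[0]
    dy = current[1] - start[1]
    distance = math.hypot(dx, dy)
    # 0 degrees = rightward; positive = downward in screen coordinates
    angle = math.degrees(math.atan2(dy, dx))
    return distance, angle
```

A dial could, for example, use `distance` to scale its sensitivity while `angle` selects which parameter is edited.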
The movement of notes on a piano roll is a simple example, but shows the complexity:
Is it a single note or multiple notes?
Are notes supposed to keep their harmonic context, their time context ?
Is movement free, or is there grid alignment?
What happens if the destination area already contains notes ?
Will those be shifted, replaced, or do the moved notes overlap ?
(no two notes can have the exact same position in MIDI, afaik)
The moved selection of notes may be non-contiguous, which adds complexity when handling the target area. (Just a quick shot of ideas without deep meditation.)
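The policy questions above (grid alignment, what happens to notes already in the target area) can be made concrete in a small sketch. This uses a hypothetical note data model, not any particular app's implementation; `policy` chooses between replacing and overlapping the destination notes:

```python
def move_notes(notes, selected, d_pitch, d_time, grid=0.25, policy="replace"):
    """Move a (possibly non-contiguous) selection of piano-roll notes.

    notes:    list of dicts {"pitch": int, "start": float, "len": float}
    selected: indices into `notes` of the notes being dragged
    grid:     snap resolution in beats (None/0 for free movement)
    policy:   "replace" deletes notes already at a target position,
              "overlap" keeps both (MIDI itself forbids two identical
              note-ons at the same tick, so "overlap" would need merging).
    """
    moved = []
    for i in selected:
        n = notes[i]
        start = n["start"] + d_time
        if grid:
            start = round(start / grid) * grid  # snap to the grid
        moved.append({"pitch": n["pitch"] + d_pitch,
                      "start": start, "len": n["len"]})
    rest = [n for i, n in enumerate(notes) if i not in selected]
    if policy == "replace":
        occupied = {(m["pitch"], m["start"]) for m in moved}
        rest = [n for n in rest if (n["pitch"], n["start"]) not in occupied]
    return rest + moved
```

Even this toy version shows why the design questions matter: the collision rule alone changes the result of a drag.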
The drag action as shown in the video requires a lot of space and precise finger movement.
But it could be scaled in a way that it works even on a tiny phone screen.
As an alternative, one could simply tap the note(s) and have a control-strip area for the up/down movement. Another such strip could provide horizontal movement.
With the strips always at the same location, routine builds up quickly and there's only one point of attention.
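As a sketch of the strip idea: a vertical drag on a fixed strip maps to discrete pitch steps. The sensitivity value below is an illustrative guess, not taken from any app:

```python
def strip_drag_to_steps(dy_points, points_per_step=14.0):
    """Map a vertical drag on a fixed-position control strip to pitch steps.

    Screen y grows downward, so dragging up (negative dy) moves the note up.
    `points_per_step` is a made-up sensitivity; a coarse value is what
    keeps the gesture usable even on a tiny phone screen.
    """
    return -round(dy_points / points_per_step)
```

Because the strip never moves, the mapping stays identical on every use, which is exactly why the routine builds up quickly.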
Yes yes yes. This is all very complex already if you want to realize it for a desktop with mouse and keyboard.
A touchscreen device opens completely new areas of user interaction. The possibilities seem overwhelming at first but will finally boil down to a very few, very well-thought-out paradigms; something shared by all apps.
The first deep-impact experience that touchscreen usage brought was pinch & zoom in the browser. That was cool! You could actually zoom into the area of interest!! Revolutionary! Unfortunately this cool thing never found its way into most apps (apart from some graphics tools).
But the current reality is quite different: static desktop versions are simply ported nearly 1:1 onto the iOS platform, especially with many Audio Units. The popups and menus were just translated. These popups, by the way, which lock the entire screen along with all the controls, are another awfulness on iOS. I would forbid them if I ever could.
And these minified static interfaces are not even made zoomable, so nobody can actually read the labels or hit the controls correctly.
But what's more, iOS users seem to have been "trained" in the meantime not to even try zooming, because they have done it too often with zero results.
I have seen users claiming that everything is too small and unusable, because they did not even try a pinch gesture, which would have solved their problem.
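Zooming into "the area of interest", as the browser does, is a small piece of arithmetic: keep the content point under the pinch centre fixed while the scale changes. A 1-D sketch (function and parameter names are mine, not from any framework):

```python
def anchored_zoom(offset, scale, focal_px, new_scale):
    """Recompute the scroll offset so the content under the pinch centre
    stays under the finger when the scale changes.

    offset:    content coordinate at the left edge of the screen
    scale:     current pixels per content unit
    focal_px:  pinch centre, in screen pixels from the left edge
    new_scale: pixels per content unit after the pinch
    """
    content_at_focal = offset + focal_px / scale
    return content_at_focal - focal_px / new_scale
```

The same formula applied on both axes is all a "zoomable" plugin UI would need in order to make tiny ported desktop controls readable.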
Something like this? 😉
Ableton Start Repeat.
Success.
Yes... and I'd certainly have cited your app, but since I'm tied to iOS 9 I never had the chance to dig into it.
Then try NanoStudio 1; it works on iOS 9 and has the same control elements... (Un)officially they are called "drag handles" :-) That was probably the first app to use this concept on iOS... then BM2 (which should work on iOS 9 too, I think), Xequence, NS2 (obviously). I wonder why more apps haven't adopted this; it should be the gold standard in any DAW on iOS.
IMO, the above type of "drag handle" control is the fastest and most efficient way for editing piano-roll notes.
When I compose, I use two-phase piano-roll melody writing. The first phase is real-time entry: playing just one or more MIDI controller keys to lay down notes on the piano roll that act as rhythm, note-length, and note-velocity placeholders.
In phase two, I go back and can very quickly move each placeholder note up or down in pitch using the drag handles on the right side of the screen.
I've tried this composing method with other piano-roll editors that use finger dragging, and dragging is just not feasible for this kind of piano-roll composing technique.
I'm hopeful that one day a perfected system for two-phase piano-roll melody writing might be made available as part of a new DAW like the OP is discussing, an existing app, or perhaps a standalone piano-roll MIDI composing AUv3 app built for two-phase melody writing (hopefully it would also include full-featured step-recording capabilities).
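The second phase described above, nudging each placeholder up or down in pitch while keeping its rhythm, length, and velocity, amounts to editing only the pitch field of each note. A sketch with an illustrative data model (not any particular app's):

```python
def apply_pitch_edits(placeholders, semitone_steps):
    """Phase two of two-phase melody writing: shift each placeholder note
    by the semitone offset chosen with its drag handle, leaving timing,
    length, and velocity untouched.
    """
    return [dict(note, pitch=note["pitch"] + step)
            for note, step in zip(placeholders, semitone_steps)]
```

This separation is what makes the drag-handle workflow fast: the rhythmic data is already final, so each handle gesture has exactly one degree of freedom.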
I also imagine one could do a 'joystick' that is all in the lower right corner so that up and down could be done with just the thumb. Works for Fortnite.
Yes, this kind of control is a fairly classic thing.
My main point was that it's not always the most effective solution to use an object's location in touch treatment.
Twisted Wave has an ultra-smart zoom feature (which took me some time to fully figure out):
Dragging up and down exactly on the playhead line zooms out/in with the playhead position fixed at its current location. The further your finger is left or right of the yellow line, the more the waveform (and playhead position) moves in the opposite direction during the zoom.
It's a feature that will probably reveal its full potential only after a fair amount of getting used to this particular movement.
Its display handling and screen refresh are the king of all iOS apps.
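Assuming the behaviour described above, the maths for one step of such a drag could look roughly like this; the gain values and names are guesses for illustration, not Twisted Wave's actual implementation:

```python
def zoom_drag(scale, offset, playhead_t, drag_dy, dx_from_playhead,
              zoom_per_point=1.01, pan_gain=0.02):
    """One step of a Twisted Wave-style zoom drag (hypothetical sketch).

    Dragging up (negative dy) zooms in. With the finger exactly on the
    playhead line (dx_from_playhead == 0) the playhead's screen position
    stays fixed; horizontal distance from the line adds a pan in the
    opposite direction, proportional to that distance.
    """
    new_scale = scale * (zoom_per_point ** -drag_dy)
    # keep the playhead at the same screen x while the scale changes
    playhead_px = (playhead_t - offset) * scale
    new_offset = playhead_t - playhead_px / new_scale
    # waveform moves opposite to the finger's horizontal offset
    new_offset += dx_from_playhead * pan_gain / new_scale
    return new_scale, new_offset
```

The clever part is that one continuous gesture blends two parameters (zoom amount and pan direction) from the finger's position relative to a single reference line.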
There's a (small) grain of salt: the sensitivity of selections to erroneous hits - oops, selection gone. A dedicated 'selection lock' state might save the situation and, as an additional bonus, allow multiple selections.
Depending on context this can make things much easier to handle, but takes design effort to not interfere with workflow. Some apps have this mode-lock feature.
That's why Xequence by default uses DOUBLE-tap in an empty area to unselect. It's hard to imagine you'd quickly miss the same note twice in a row 😉
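The accidental-unselect problem and the double-tap fix reduce to a tiny time-window check. A sketch; the 0.3-second window is my guess, not Xequence's actual value:

```python
def should_unselect(empty_tap_times, window=0.3):
    """Unselect only when two consecutive taps land in an empty area
    within `window` seconds; a single stray miss is ignored.

    empty_tap_times: timestamps (seconds) of taps that hit no object.
    """
    return (len(empty_tap_times) >= 2
            and empty_tap_times[-1] - empty_tap_times[-2] <= window)
```

One missed tap is far more likely than two quick misses in the same spot, which is exactly the asymmetry the double-tap rule exploits.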
But Twisted Wave is an audio editor with no 'free areas', unless you mean the title line...
Don’t.
I think you've made the best MIDI editor for the touch screen. Nice amount of options that don't get in the way.
I prefer drawing automation in NS2 but it's slower than X2 for note stuff.
Hey since you're here. Can you confirm that we are 2 different people?
@hansjbs seems to think we are the same LOL.
Thanks... yes, editing automation in Xequence is very simplistic, i.e. you directly edit the actual MIDI data, no additional vector layer like in most other editors. Has pros and cons I guess.
🤦🏾‍♂️🤦🏾‍♂️🤦🏾‍♂️🤷🏽‍♂️🤷🏽‍♂️🤷🏽‍♂️
You could just say, "sorry I was mistaken."
I’ve been building an iOS first (but cross platform) DAW for the last year as a hobby project. I’ve recently started talking to zPlane to license Elastique for time stretching, as the Sampler is a big focus. (I tried to model it after Simpler as much as possible, and yes I plan to make it an AuV3 stand-alone from the DAW).
However I’m at a crossroads. The project is a labor of love and has been a tremendous amount of work (but it works, mostly!). I’m not sure if it’s worthwhile to try to release it publicly and build along with the community here, or keep it to myself as a fun side project.
Thoughts? Would folks want to hear more/help steer the development and roadmap if I made it public?
You can also reach out to me directly if you’re interested in collaborating or chatting in more depth.
Just my personal opinion, but I’d say put 100% into finishing the auv3 sampler first.
There’s a much bigger hole in the App Store for a Simpler-sampler than another DAW.
Haha, I'm in the same boat: a month ago I started my first iOS audio app. For now it's only a MIDI plugin, but of course I also dream big, and my "ultimate goal" is a separate DAW. AUv3 is awesome, but very limited if you want to control the whole creative workflow. I also came to the same conclusion that for a proper sampler, AUv3 is a no-go (mostly because of the memory limits).
Of course, an AUv3 sampler with Elastique in it would be dope, but I would also prefer deep Audiobus integration for a standalone app where you can have multiple channels / sampler units and multichannel output via AB, plus proper AB state saving. Currently there is no timestretching app into which you can just load loops, save with an AB project, and recall it anytime later. Not to mention multiple outputs...
Definitely release your app via TestFlight as soon as possible; I bet you'll get a lot of feedback and will still spend a lot of time incorporating it before a proper release. But be very strict about feature requests and avoid feature creep, unless you're 100% sure your app would make no sense without a given feature. You can always work on a version 2 in parallel that will contain the new features.
Hey @klownshed and @Eclipxe ! Read the thread, super interested in your ideas, working on something similar myself. Would y'all be open to chat about it? Don't hesitate to reach out!
If you’re gonna commission someone, can you commission the nanostudio chap to add audio tracks and AU automation? (Or whatever the other missing feature was). That’s the best ‘traditional’ daw on iOS. Better yet commission him and a dev team. Whatever the cost, it’ll be far less than paying someone to build a daw from the ground up!
I was going to mention Xequence 2. I recently got it after having read about it multiple times. That arrow-handle implementation for moving notes is just spot-on, fantastic. The best I've seen for a touchscreen.
Agreed! Nanostudio 2 has very similar handles.