People who make music on iOS and finalise your mix on desktop, what's your workflow?
Do you just use iOS to come up with a general idea/chord progression/melody etc. and use desktop to arrange/finalise your song, using your desktop DAW's plug-ins and effects?
Do you export your stems and mix on desktop?
Do you export a mixdown and master on desktop?
I've got Logic on my Mac but admittedly I haven't used it much, since I find my iPad much more convenient for making music on the go and even at home. I'd like to start using Logic to mix/master my songs, though, so I'm curious what other, more expert people's workflows look like.
Comments
Certainly not claiming to be an expert. My current workflow is based around working towards playing live (see the video in a thread about recording external stuff). My setup is something like iPad / Komplete Audio 6 / Norns / external synths. I've got that on a little table, running into Logic via the ADAT expander on my Focusrite over a couple of long guitar leads. Which, at first glance, might seem kind of silly: going out of one sound card into another. HOWEVER, it turns out to be a really nice workflow that focuses on the music and not the tech. I sit at my little setup and play about. When I'm ready to play, I get up, go over to my desk, press record in Logic, then sit back at the live setup and play. Getting the mixing and levels right has to happen on the iPad (in AUM) because the aim is playing live, so when it comes to the final mix it's a bit of compression, a bit of overall EQ, and job done.
Context: My iOS usage is usually sequencing rhythm parts, and then we (my duo) play guitars, synths, and sing live over top of that. Though I could of course sequence the other synth leads, pads, etc., I like to play the main standout parts. I use a lot of velocity and like to tweak the knobs for expression live. The guitar and singing are harder to sequence.
But to answer your questions, I would break it down by individual track.
Tracks that are individual samples, usually drums & percussion: I have a copy of these on the desktop, with matched notes in a sample player there, so I just copy the samples and MIDI over. It's a quick step that allows finer editing, sometimes making small changes I didn't anticipate until I could hear the full mix.
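For anyone scripting that part, here's a minimal sketch of what "matched notes" means in practice: check that every note in an exported MIDI clip has a correspondingly named WAV on the desktop side. It assumes samples named after their MIDI note numbers (e.g. 36.wav) and uses the third-party mido package; the file and folder names are just placeholders.

```python
# Sketch: confirm every note in the exported MIDI clip has a matching WAV
# named after its note number, e.g. 36.wav (names below are hypothetical).
from pathlib import Path

import mido  # third-party: pip install mido

MIDI_FILE = Path("drums.mid")     # MIDI exported from the iPad
SAMPLE_DIR = Path("samples")      # folder of WAVs named by MIDI note number

# Collect every note number actually played in the clip.
used_notes = set()
for msg in mido.MidiFile(str(MIDI_FILE)):
    if msg.type == "note_on" and msg.velocity > 0:
        used_notes.add(msg.note)

# Report any note that has no correspondingly named sample.
missing = [n for n in sorted(used_notes) if not (SAMPLE_DIR / f"{n}.wav").exists()]
print("Missing samples for notes:", missing or "none")
```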
The rhythm parts...
App simple drums: I use the Ruismakers a lot. Once the song is where we're happy with it, I sample the Ruismaker kit from Ruismaker OG and FM. The main velocity parameter there is simply amplitude, so it's easy to transfer the MIDI and samples over. That's just an easier way to have each Ruismaker drum on an individual track versus repeatedly muting the others and doing mixdowns.
App other rhythm sounds:
Ruismaker Noir is my other main drum source. It's a whole dynamic- and pitch-controlled thing with lots of subtle variations, so it would be slow to sample. So I do a mixdown of that and send the WAV over. If I want any different effects on the low or high bits there, I just use multiband sends.
Other synths, often bass lines: I just do a mixdown.
Doubling up
With all the rhythm parts, I copy the MIDI over just in case I feel like doubling anything with Tremor or replacing anything with a hardware synth.
The lead etc. parts...
After the above is recorded, I simply record these live. I may substitute other hardware synths that aren't part of my portable setup, or supplement with other iOS synths that are CPU hungry.
I'm also running out of one Focusrite into another for the iOS live sounds. Hardware goes either direct into the Focusrite 18i20 or into a tube amp and is then mic'd.
@Multicellular Are you recording Ruismaker sounds individually by muting and then sampling the sequence sound by sound?
To elaborate
1. Sample Ruismaker OG and FM by making a temporary loop at the end of the song and putting in ascending notes, one per bar (or longer if any have a long decay).
2. Mix down just that bar.
3. Export that WAV to Twisted Wave to cut one sample per drum. I actually find Twisted Wave as easy as any desktop editor for simple, precise sample editing. I name the files after the MIDI note numbers.
4. Send those samples over to the desktop and put them in the song folder.
5. Load the samples in the DAW according to the note names (see the sketch after this list).
6. Add the exported MIDI.
7. Good to go.
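As a rough sketch of step 5, here's one way to turn those note-numbered WAVs into a drum map automatically, by writing a minimal SFZ file. It only applies if your desktop sampler can load SFZ, and the folder and output names below are just placeholders.

```python
# Sketch: build a minimal .sfz drum map from WAVs named after MIDI note numbers
# (folder and output file names are hypothetical).
from pathlib import Path

SAMPLE_DIR = Path("song_folder/ruismaker_samples")  # assumed location of the samples
OUT_FILE = SAMPLE_DIR / "drumkit.sfz"

regions = []
for wav in sorted(SAMPLE_DIR.glob("*.wav")):
    try:
        note = int(wav.stem)          # file name is the MIDI note number, e.g. 36.wav
    except ValueError:
        continue                      # skip anything not named by a note number
    # key= sets lokey, hikey and pitch_keycenter in one go; one_shot suits drums
    regions.append(f"<region> sample={wav.name} key={note} loop_mode=one_shot")

OUT_FILE.write_text("\n".join(regions) + "\n")
print(f"Wrote {len(regions)} regions to {OUT_FILE}")
```

Since an SFZ is just a text file, it's easy to regenerate whenever the samples get re-exported.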
I exported track by track (one reason why I used Gadget) into Mixbus 32C on my PC and then went to work. Sometimes I'd also export MIDI for a track or section to see if I could replace the sound with another instrument.
I collect material on iOS to build tracks with on desktop.
My workflow used to be:
Record to Revox A77 with some kind of drum machine/click. Then I'd print that to Ableton and build a track from there.
I do the same thing with iOS replacing the tape machine. It sounds kind of dumb but I don't play like myself if I'm recording "takes" to a DAW.
Thanks for explaining.
Exact reciprocal....
Hardware and Mac, finalize on iPad....
I create most if not all of the track on iOS, then create individual stems for each channel or for groups, depending on the project, which I then import as individual audio files into my desktop DAW for final processing.
With that said, I do also import MIDI tracks if I'd like a lead, bass or pad to be adjusted, additionally layered, or replaced with an instrument on my desktop.