Orchestral music on iPad


Comments

  • edited February 2017

    So, e.g. for Chris Hein Solo Violin - check this out. Note especially what's said around 1:28 about live vs. MIDI-rendered playing, then all the articulation possibilities from around 4:01 and again from 6:19. Stunning. It's on my "maybe" list, for when I can justify the cost and learning time across the CH range :smiley: ($179 just for this one instrument, solo violin!). Thankful for the libraries I already do have - still learning those!

  • @MusicInclusive said:

    @ipadmussic said:

    Just wondering: with our current technology, is this really so hard to achieve - translating music from notation software into a DAW recording, using sample libraries and preserving all the articulations?
    Maybe there's just no market for it. I guess folks either record directly into a DAW using real instruments or sample libraries, or, like you, "write" using notation software.

    I belong to the irrelevant market here :smile: ...no music theory knowledge and no instrument skills. Well, for now I'll keep trying on the iOS platform... and maybe someday upgrade myself to the above group.

    Well. Hmm. I think once Dorico moves from pre-alpha to a product that should actually be on sale, it might provide more in-program tweaking, with DAW-like facilities that let people do that notation-to-realistic-rendering directly in a notation program. (IMO it was put on sale while still at a pre-alpha stage, and it's only now moving into beta with all the feedback from the people who bought it. It's a great product, or it will be when it's finished - kudos to the devs - but Steinberg's project management seems to have gone a bit odd there.) But, really, it isn't finished yet.

    Desktop Notion is probably the easiest way to do this directly at the moment in a fashion that produces an immediately pleasing rendered result, but people have long worked directly in DAWs like Cubase - since its early days - entering notes by hand into MIDI tracks and instrumenting them. Google "Cubase expression maps" and you'll see what people have worked up.

    A lot of people do that, yes, with either the traditional libraries like EastWest and VSL, or with the wonderfully sculpted but very complex newer Kontakt libraries such as Spitfire's Albion range, Chris Hein, Embertone and so on. Those provide a great deal of expressive variation, but they are more often played in from a keyboard than scored in, say, Notion and rendered automatically. Or, if the latter is done with custom mappings, further expression work is still done in a DAW afterwards.

    I highly recommend watching the latest Chris Hein library videos to get a feel for the depth of realism that can be achieved - especially where the library is shown emulating the recorded violin. BUT know that a significant amount of manual work went into achieving that. Multiply that by a symphony's worth of instruments and bars!!! A lot of work.

    Have you noticed that a lot of computer-instrumented film music these days is staccato? And that if there are lush strings, they're usually not long solo passages? (Or if they are solo, they're recorded live with a real violinist.) There's a reason for that :smile: - staccato and staccato-like articulations are much easier to handle, and the ear is more forgiving of them than of emotionally expressive solo instruments. Can it be done? Sure, but it takes a lot of work, and doing it over a longer passage is still a lot of work.

    That being said a lot can still be done pretty well - even with things like Notion on iOS.

    Thanks. Checked out Dorico & Chris Hein Violins. Amazing stuff! The violin library especially was mind-blowing.

    I hope those expression maps would make it to Cubasis someday :smile:

  • edited February 2017

    @ipadmussic said:

    I hope those expression maps would make it to Cubasis someday :smile:

    That would be good. However... there needs to be something to hook it into! Expression maps work via keyswitches, and we don't have any comprehensive keyswitching apps on iOS. SampleTank does some keyswitching for some instruments, but not the orchestral ones. One can set up a separate MIDI channel for each articulation, as in iSymphonic, but then you only get 8 channels, and it only offers a single stereo out. So, again, one can't do expression maps (in a single MIDI channel) because the instrument isn't being triggered that way.

    But, maybe, one day. (Along with 16GB RAM and 1TB storage :wink: )
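    For anyone unfamiliar with how expression maps drive a sampler, here's a minimal sketch of the keyswitch mechanism described above: a low, out-of-range note sent before each phrase tells the instrument which articulation to use. The articulation names and note numbers below are purely illustrative, not taken from any particular library.

```python
# Keyswitch sketch: the articulation-to-note mapping and the note numbers
# are illustrative assumptions, not from any specific sample library.
KEYSWITCHES = {
    "sustain":   24,  # C1
    "staccato":  25,  # C#1
    "legato":    26,  # D1
    "pizzicato": 27,  # D#1
}

def phrase_events(articulation, notes, channel=0):
    """Return (status, data1, data2) MIDI tuples: keyswitch first, then notes."""
    ks = KEYSWITCHES[articulation]
    events = [(0x90 | channel, ks, 1),   # keyswitch note-on (very low velocity)
              (0x80 | channel, ks, 0)]   # and its immediate note-off
    for pitch, velocity in notes:
        events.append((0x90 | channel, pitch, velocity))
    return events

events = phrase_events("staccato", [(60, 96), (64, 96)])
```

    This is exactly why a single MIDI channel suffices on the desktop: the articulation choice rides along in the note stream instead of needing one channel per patch.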

  • @MusicInclusive since you are here and expounding on Notion, can I ask you about something I've been struggling with in my last few days of Notioning? (Mostly iPhone 6+, sometimes on iPad). Is there a way to articulate -- and have reflected in the playback -- a soft dying-off of a note? I have a clarinet dotted half note where I need the ending to not be so damn distinct, and applying a diminuendo results in a very false sounding 'someone pulling down the volume slider' effect. I tried a hairpin, but that was also too radical a decline, and if I shortened the hairpin length it basically did nothing. What I am after is like what I would get on a sampler synth by increasing the Envelope Release for that note only. "Let Ring" doesn't seem to do anything for wind instruments. Any tips?

    @JonLewis You only have available to you what's available for regular scoring - so the dim. is what I'd use for that. If it's not sounding how you want, it's hard to know what to do: basic dynamics control is normally achieved like that, and if the underlying sample doesn't provide any further expressivity, you're stuck. I tried this on the desktop too, and the best I could achieve with the built-in LSO samples was to also add a strong accent with tenuto. On iOS, that leads to a messy-looking score, with articulations one would probably not write for an actual player. In desktop Notion, however, one can add those for rendering and then hide them from view - this is one way the sample scores in desktop Notion achieve a much more realistic-sounding rendering. But, yes, for a solo clarinet part with a sustained note... Meh! :neutral:


    However, I also tried with the EWQL library in desktop Notion and it's the same, so here one is at the mercy of the mappings. One could rewrite the mapping rules (in desktop Notion at least - not iOS) to trigger, say, the expressive legato EWQL patch on dim+accent+tenuto, and in that case one would achieve an effect very much like what you want. I tried playing just that patch (expressive legato) in EWQL Play and it is - I think - what you are asking for. However, there's no way to make Notion do it out of the box.

    What one could do on iOS - messy though it is - is mute the other parts, export individually, rinse, repeat, import into, say, Auria Pro, and then use volume automation over that note to achieve a better emulation. Though that sounds like a lot of work, it's exactly what desktop instrumenters do in DAWs like Cubase, Pro Tools, Studio One, Logic etc. - they fine-tweak the expressions, voice choices and dynamics. So it's not really much more work (apart from the awkward exporting) in Notion for iOS.

    Eh! :neutral:

    Again, it's a limitation of current scoring programs - Sibelius and Finale included (though there is hope Dorico will do better) - that they don't provide as broad a range of articulation and expression mappings as one would like. The nice thing about desktop Notion is that one can dig into the mappings, alter them and add to them to do what I've suggested, and then one wouldn't have to do the messy DAW thing on top.
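    The export-then-automate workaround described above amounts to applying a gentle fade to a note's tail, much like lengthening a sampler's release stage. Here's a hedged sketch of that idea in plain Python (the function name and curve constant are my own, not anything Notion or Auria does internally):

```python
# Sketch: soften a note's abrupt ending by applying an exponential fade
# over its last tail_len samples. Operates on a plain list of sample values;
# the curve constant is an illustrative assumption.
import math

def fade_tail(samples, tail_len, curve=4.0):
    """Return a copy of samples with an exponential fade over the tail."""
    out = list(samples)
    start = max(0, len(out) - tail_len)
    for i in range(start, len(out)):
        t = (i - start) / tail_len       # 0.0 at fade start, approaching 1.0
        out[i] *= math.exp(-curve * t)   # smooth decay toward silence
    return out

note = [1.0] * 100                       # a flat, abruptly-ending note
faded = fade_tail(note, 50)              # last half dies away softly
```

    A DAW's volume automation lane does the same thing with a drawn curve instead of a formula; an exponential shape tends to sound more natural than the linear "someone pulling down the volume slider" effect described earlier.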

  • Some day I'll have time to do this stuff at home on an actual computer! Until then, the limits are the limits. I just wanted to see if there was an option I was missing. I'll try the dim with strong accent + tenuto as you indicated. It also occurred to me to shorten my note and tie it to a second note on the same pitch but at a lower dynamic level. But I have a feeling that's going to sound weird.

    The 'stem' based (mute all but clarinet, etc) workaround you described is not so onerous, really. If this piece gets to be important for me, that would be worth the time. I would probably be exporting audio from Notion in order to have more reverb options anyway, if I get to where I want to send one of these Notion projects out into the world. I should actually make a feature request on PreSonus' forum for individual 'send' knobs in the Mixer. I have no objection to Notion's onboard reverb per se, but I'd rather be able to apply more of it to some instruments than others.

  • @JonLewis said:
    Some day I'll have time to do this stuff at home on an actual computer! Until then, the limits are the limits. I just wanted to see if there was an option I was missing. I'll try the dim with strong accent + tenuto as you indicated. It also occurred to me to shorten my note and tie it to a second note on the same pitch but at a lower dynamic level. But I have a feeling that's going to sound weird.

    The 'stem' based (mute all but clarinet, etc) workaround you described is not so onerous, really. If this piece gets to be important for me, that would be worth the time. I would probably be exporting audio from Notion in order to have more reverb options anyway, if I get to where I want to send one of these Notion projects out into the world. I should actually make a feature request on PreSonus' forum for individual 'send' knobs in the Mixer. I have no objection to Notion's onboard reverb per se, but I'd rather be able to apply more of it to some instruments than others.

    Right - in any case I do that typically (i.e. export without Notion reverb). I like using AltiSpace. Plus I can do some gentle work with Pro-EQ, Pro-MB and Pro-L in Auria Pro as needed.

    Exporting the stems is a bit onerous for a full orchestral score, but not so bad with, say, a string quartet. Then one can of course do more individual work.

  • After reading all the opinions and other experience, and after trying SSO on Auria Pro, I think this will be much easier and faster on a PC.

  • @MusicInclusive, have you ever talked to Presonus and Wavemachine about the possibility of getting key switching in their apps? It seems like the major limitation to really getting great midi. My composition teacher just finished a sound track for a documentary, all done in Sibelius and Logic on the Mac.

  • @rickwaugh said:
    @MusicInclusive, have you ever talked to Presonus and Wavemachine about the possibility of getting key switching in their apps? It seems like the major limitation to really getting great midi. My composition teacher just finished a sound track for a documentary, all done in Sibelius and Logic on the Mac.

    I will definitely consider bringing it up. Appreciate the prod :+1: :smiley:

  • Bumping an older thread here, but hopefully my question is somewhat on-topic. I've got a sparsely arranged orchestral piece I'm working on with only 7 or 8 different instruments. I'm wondering how some of you would approach the reverb. Would you apply the reverb to each individual track? Or would it be more realistic to have the tracks completely dry and apply the reverb to the entire mix? I'm actually thinking of having the tiniest bit of a room reverb applied to the individual tracks (just to provide a small sense of space for each instrument), and then apply probably a small hall to the entire mix. Any thoughts?

  • @Joel75 said:
    Bumping an older thread here, but hopefully my question is somewhat on-topic. I've got a sparsely arranged orchestral piece I'm working on with only 7 or 8 different instruments. I'm wondering how some of you would approach the reverb. Would you apply the reverb to each individual track? Or would it be more realistic to have the tracks completely dry and apply the reverb to the entire mix? I'm actually thinking of having the tiniest bit of a room reverb applied to the individual tracks (just to provide a small sense of space for each instrument), and then apply probably a small hall to the entire mix. Any thoughts?

    What's ideal is to apply distance-specific early reflections to each instrument individually - i.e. you use reverb to place each instrument in depth on the stage (more reverb = farther away, less = nearer), in addition to panning it left/right and narrowing its stereo field. Then decide whether to apply a reverb tail across the whole piece, either as a 50% wet insert on the master or as individual sends from the tracks. Using individual sends also gives you the ability to EQ the signal going to the reverb, pre and/or post.

    In addition, you can add crossed early reflections, panned over to the other side of the stage, with another set of reverbs.

    Depending on the reverb, you may be able to accomplish some or all in one plugin. On the desktop I use two: Waves Trueverb for the early reflections (and the cross early reflections) and then Waves IR-L for the tail - along with some additional pre-delay. If you shell out for something like VSL's MIR or VSS 2 on the desktop, then it does all this for you.

    Oh, you also need to decide whether your reverb is pre- or post-pan, to give the reverb itself some directionality.
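    To make the depth-placement idea above concrete, here's a rough sketch. The function and its constants are illustrative only (real plugins like Trueverb expose their own controls); it just captures the rule of thumb that an instrument farther back gets more reverb, while a longer pre-delay pushes a source toward the front:

```python
# Illustrative sketch: map an instrument's stage depth to a reverb
# pre-delay and wet amount. Constants are assumptions, not plugin values.
SPEED_OF_SOUND = 343.0  # metres per second

def placement(distance_m, max_distance_m=20.0):
    """Return (pre_delay_ms, wet_fraction) for an instrument distance_m
    back from the listener; a longer pre-delay reads as nearer the front."""
    gap_m = max(0.0, max_distance_m - distance_m)
    pre_delay_ms = gap_m / SPEED_OF_SOUND * 1000.0
    wet = min(1.0, distance_m / max_distance_m)   # farther = wetter
    return round(pre_delay_ms, 2), round(wet, 2)

front = placement(3.0)    # e.g. first violins near the front
back = placement(15.0)    # e.g. percussion at the back of the stage
```

    With per-track sends, you'd dial in something like the wet fraction on each send and set the pre-delay on the shared tail, rather than computing anything; the sketch is just the mental model.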

  • Check the recent addition to the Auria Pro Tips and Tricks thread too concerning panning with M/S EQ.

  • Thanks, @MusicInclusive. Even though I'm panning the individual instruments, I hadn't really considered the interaction/relationship between panning and reverb. As I'll be doing this all on the iPad, I'm probably going to use Pro-R inside Auria. Thanks for the food for thought!
