
Sampler audio unit needed!

Has anyone heard of a sampler synth audio unit that could fit into an effect slot?
Right now I use elsa, which does it as an IAA, but app switching is such a workflow killer that I'll always look for an AUv3 alternative.

Anything in the pipeline?
Would it be feasible? I mean, could a MIDI keyboard be used to play an AUv3 effect?

Thanks


Comments

  • @Philippe said:
    Has anyone heard of a sampler synth audio unit that could fit into an effect slot?
    Right now I use elsa, which does it as an IAA, but app switching is such a workflow killer that I'll always look for an AUv3 alternative.

    Anything in the pipeline?
    Would it be feasible? I mean, could a MIDI keyboard be used to play an AUv3 effect?

    Thanks

    The problem with effect slot AUs is that they cannot accept MIDI. Not yet, anyway. It might change with iOS 11. I could be mistaken, though, since I vaguely remember this from a conversation I had with @brambos.

  • Main question is: why in an effect slot instead of an instrument slot?

  • @brambos said:
    Main question is: why in an effect slot instead of an instrument slot?

    Because an effect slot can receive audio and sample it directly.

  • Not sure if it fits your requirements, but I found AUM's virtual busses quite useful in this context.

  • @Philippe said:

    @brambos said:
    Main question is: why in an effect slot instead of an instrument slot?

    Because an effect slot can receive audio and sample it directly.

    Sampling into an instrument is ok though. Consider Loopy.

  • I know he's busy, but you could always run your question past Michael Tyson. He's a wealth of knowledge, and you might even ask him whether there are any interested developers you could suggest your idea to. He's really good about replying and a super pleasant individual.

  • The new Samplerbot app could do the trick here...

  • It would be great if @VirSyn made ReSlice an AU.

  • @gusgranite said:
    It would be great if @VirSyn made ReSlice an AU.

    It is...

  • The BeatMaker 3 sampler can sample from AU, IAA or Audiobus 3.

  • @gonekrazy3000 said:

    @Philippe said:
    Has anyone heard of a sampler synth audio unit that could fit into an effect slot?
    Right now I use elsa, which does it as an IAA, but app switching is such a workflow killer that I'll always look for an AUv3 alternative.

    Anything in the pipeline?
    Would it be feasible? I mean, could a MIDI keyboard be used to play an AUv3 effect?

    Thanks

    The problem with effect slot AUs is that they cannot accept MIDI. Not yet, anyway. It might change with iOS 11. I could be mistaken, though, since I vaguely remember this from a conversation I had with @brambos.

    iDensity has an AU that can go into the effect slot, be controlled via MIDI, and can record audio, so clearly it's already possible to create an AU sampler that goes in the effect slot and meets these criteria.

  • Interesting, I'll check that out.

  • @Dubbylabby said:

    @gusgranite said:
    It would be great if @VirSyn made ReSlice an AU.

    It is...

    Sorry, I meant as an AU Effect, so an instrument can be routed to it in AUM without IAA app switching (like the OP's request).

    Or am I missing something here?...

  • @gusgranite said:

    @Dubbylabby said:

    @gusgranite said:
    It would be great if @VirSyn made ReSlice an AU.

    It is...

    Sorry, I meant as an AU Effect, so an instrument can be routed to it in AUM without IAA app switching (like the OP's request).

    Or am I missing something here?...

    Well, the FX slot is for effects, whereas a sampler is a generator or recorder. Sampling as a real-time effect was usually done with a delay (or tape recorder) with feedback at 100% wet. That also meant a short buffer length (usually limited by the hardware memory chips in delay pedals) and being limited to hold/repeat, since those units weren't designed with "chopping" in mind.
    The kind of effect you describe sounds like a Moldover thing to me...

    I don't know if there's anything on the market like this (even on desktop) with those chopping capabilities. A while ago I dreamed of something similar in Max for Live with vinyl timecode control for my crazy turntablism ideas... and mlrv was the nearest thing outside Reaktor or Tim Exile's NI plug-ins...

    Blocs Wave would be my first choice, adapting my workflow instead of waiting for an AU FX sampler, but maybe it's me who's missing something in your workflow, sorry.

    Hope it helps, mate.
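    For illustration, here's a minimal Swift sketch of that hold/repeat idea - a delay line that keeps recording until you freeze it, after which it simply repeats its buffer (100% wet, full feedback in spirit). The type name, buffer length and mono Float API are all made up:

    ```swift
    // A toy hold/repeat delay: while `hold` is off the buffer keeps recording;
    // once `hold` is on, the loop simply replays its contents.
    struct HoldDelay {
        private var buffer: [Float]
        private var writeIndex = 0
        var hold = false                       // freeze the buffer and repeat it

        init(lengthInSamples: Int) {
            buffer = [Float](repeating: 0, count: max(1, lengthInSamples))
        }

        mutating func process(_ input: Float) -> Float {
            let output = buffer[writeIndex]
            if !hold {
                buffer[writeIndex] = input     // keep sampling until frozen
            }
            writeIndex = (writeIndex + 1) % buffer.count
            return output
        }
    }
    ```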

  • @Philippe said:
    Has anyone heard of a sampler synth audio unit that could fit into an effect slot?
    Right now I use elsa, which does it as an IAA, but app switching is such a workflow killer that I'll always look for an AUv3 alternative.

    Anything in the pipeline?
    Would it be feasible? I mean, could a MIDI keyboard be used to play an AUv3 effect?

    Thanks

    Also knowing how it's used (since I don't know Elsa) could be useful to find a way :wink:

  • Are there AU hosts that allow sending MIDI to AU effects? I don't think there are?

  • Great info @Dubbylabby! Appreciate it.

  • What we need is more apps that can host IAA. A looper especially. Then you can route audio to it more easily via AUM. I would love to see Blocs Wave as an IAA host inside of AUM.

  • edited August 2017

    @gmslayton
    Did you try GTL? Not sure if it does what you are describing, but maybe it will in the next release... ¿?

    Details:
    http://forum.grouptheloop.com/index.php?p=/discussion/116/finally-some-new-features-of-gtl-snapshots#latest

  • @gusgranite said:
    Great info @Dubbylabby! Appreciate it.

    You are welcome, mate! If I can do anything else... :wink:

  • @brambos said:
    Are there AU hosts that allow sending MIDI to AU effects? I don't think there are?

    Yeah, I think this is mostly a host issue. The AU type is Music Effect, but it is very possible that many iOS hosts don't provide a way to get MIDI routed in there. That said, I haven't tried it and probably shouldn't be posting prematurely.
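    For what it's worth, a rough Swift sketch of what the Music Effect type means from the host side. Only the component type ('aumf', kAudioUnitType_MusicEffect) is the real point here; the subtype, manufacturer and the unit itself are placeholders. A host can push MIDI to such a unit through its scheduleMIDIEventBlock, which is typically unavailable on a plain 'aufx' effect:

    ```swift
    import AVFoundation
    import AudioToolbox

    // Hypothetical component description: the subtype and manufacturer codes
    // are placeholders. The relevant detail is the 'aumf' (Music Effect) type,
    // the AU category that sits in an effect slot but can also receive MIDI.
    var desc = AudioComponentDescription(
        componentType: kAudioUnitType_MusicEffect,
        componentSubType: 0x64656D6F,        // 'demo' - placeholder
        componentManufacturer: 0x64656D6F,   // placeholder
        componentFlags: 0,
        componentFlagsMask: 0)

    AVAudioUnit.instantiate(with: desc, options: []) { unit, _ in
        guard let unit = unit else { return }
        // Hosts deliver MIDI through this block; if the unit doesn't accept
        // MIDI (a plain effect), there is nothing to call.
        if let sendMIDI = unit.auAudioUnit.scheduleMIDIEventBlock {
            let noteOn: [UInt8] = [0x90, 60, 100]   // note on, middle C
            sendMIDI(AUEventSampleTimeImmediate, 0, noteOn.count, noteOn)
        }
    }
    ```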

  • edited August 2017

    @sonosaurus said:

    @brambos said:
    Are there AU hosts that allow sending MIDI to AU effects? I don't think there are?

    Yeah, I think this is mostly a host issue. The AU type is Music Effect, but it is very possible that many iOS hosts don't provide a way to get MIDI routed in there. That said, I haven't tried it and probably shouldn't be posting prematurely.

    Audiobus 3, AUM, and BM3 all support sending MIDI to AU Music Effect apps. iDensity is the only AU Music Effect app that I'm aware of. Both BM3 and AUM support MIDI control of exposed AU effect parameters, which is a great way to get your effects in sync, but unfortunately Audiobus 3 does not. The MIDI stream, including output from the great MIDI effect apps in Audiobus 3, can be streamed to AU apps hosted in AUM or BM3 to control them.

  • @gusgranite said:

    @Dubbylabby said:

    @gusgranite said:
    It would be great if @VirSyn made ReSlice an AU.

    It is...

    Sorry, I meant as an AU Effect, so an instrument can be routed to it in AUM without IAA app switching (like the OP's request).

    Or am I missing something here?...

    As @Telefunky noted, you can route audio in AUM with buses and then have an AU effect looper app record it as the audio source. You could send this output through other effects and resample it back into the AU app.

  • @InfoCheck said:

    @sonosaurus said:

    @brambos said:
    Are there AU hosts that allow sending MIDI to AU effects? I don't think there are?

    Yeah, I think this is mostly a host issue. The AU type is Music Effect, but it is very possible that many iOS hosts don't provide a way to get MIDI routed in there. That said, I haven't tried it and probably shouldn't be posting prematurely.

    Audiobus 3, AUM, and BM3 all support sending MIDI to AU Music Effect apps. iDensity is the only AU Music Effect app that I'm aware of. Both BM3 and AUM support MIDI control of exposed AU effect parameters, which is a great way to get your effects in sync, but unfortunately Audiobus 3 does not. The MIDI stream, including output from the great MIDI effect apps in Audiobus 3, can be streamed to AU apps hosted in AUM or BM3 to control them.

    I knew I should have just waited for your response :)

  • edited August 2017

    I don't mind MIDI sync much, but precise alignment of audio streams is the biggest iOS flaw at the moment.
    It's not really OS-based, of course, but it can be considered a general problem, as no apps support it in any reasonable way. The AUM adjustment is essentially unusable and covers just one single aspect.

    Each stream/mixer channel should have its own adjustment option
    (which is basically soloing the two channels, a quick polarity flip of one, and adding single samples until the sum of the two is reduced to a minimal level.
    Then the polarity is flipped back and both channels are perfectly aligned).
    Otherwise the mixdown may deviate significantly from the monitored impression if the engine itself compensates (where the processing delay is known), but whether it does is uncertain in many cases.

    You may not even notice the difference, because you're used to this kind of sound and take it as a given.
    Just as people consider their (say) $100 interface preamps 'great' until they hear a real Neve or Focusrite ISA/Red.

    On desktop I don't mind this misalignment much, because I can simply shift the audio by the proper number of samples where it applies
    (it's only relevant when a signal is split and processed in parallel paths).
    But don't even think of doing such operations in Auria.
    (Twisted Wave would be cool, if only it could do multitrack...)

    In case you wonder what this is about: set an arbitrary mix to -6 dB max level, duplicate it on another track, shift it by 10-20 samples and set its volume to zero.
    Then play back and move the 2nd track's fader up and down o:)
    Smaller shifts, in the 5-sample range, will slur transients and make the effect less pronounced.

    PS: this is in no way related to audio buffer latency, which is a completely different thing.
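    As a concrete illustration of that solo/flip/nudge procedure, a small Swift sketch (the function name and search range are arbitrary, and it assumes the processed copy only lags, never leads, the dry one):

    ```swift
    /// Null test: subtract (i.e. polarity-flip and sum) a shifted copy against
    /// the reference and keep the offset whose residual is quietest.
    func bestAlignmentOffset(reference: [Float], delayed: [Float], maxOffset: Int = 64) -> Int {
        var bestOffset = 0
        var bestEnergy = Float.greatestFiniteMagnitude
        for offset in 0...maxOffset {
            let length = min(reference.count, delayed.count - offset)
            guard length > 0 else { break }
            var energy: Float = 0
            for i in 0..<length {
                let residual = reference[i] - delayed[i + offset]   // flipped sum
                energy += residual * residual
            }
            let meanEnergy = energy / Float(length)   // normalise for a fair comparison
            if meanEnergy < bestEnergy {
                bestEnergy = meanEnergy
                bestOffset = offset
            }
        }
        return bestOffset   // shift the delayed copy earlier by this many samples
    }
    ```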

  • @Telefunky said:
    I don't mind MIDI sync much, but precise alignment of audio streams is the biggest iOS flaw at the moment.
    It's not really OS-based, of course, but it can be considered a general problem, as no apps support it in any reasonable way. The AUM adjustment is essentially unusable and covers just one single aspect.

    Each stream/mixer channel should have its own adjustment option
    (which is basically soloing the two channels, a quick polarity flip of one, and adding single samples until the sum of the two is reduced to a minimal level.
    Then the polarity is flipped back and both channels are perfectly aligned).
    Otherwise the mixdown may deviate significantly from the monitored impression if the engine itself compensates (where the processing delay is known), but whether it does is uncertain in many cases.

    You may not even notice the difference, because you're used to this kind of sound and take it as a given.
    Just as people consider their (say) $100 interface preamps 'great' until they hear a real Neve or Focusrite ISA/Red.

    On desktop I don't mind this misalignment much, because I can simply shift the audio by the proper number of samples where it applies
    (it's only relevant when a signal is split and processed in parallel paths).
    But don't even think of doing such operations in Auria.
    (Twisted Wave would be cool, if only it could do multitrack...)

    In case you wonder what this is about: set an arbitrary mix to -6 dB max level, duplicate it on another track, shift it by 10-20 samples and set its volume to zero.
    Then play back and move the 2nd track's fader up and down o:)
    Smaller shifts, in the 5-sample range, will slur transients and make the effect less pronounced.

    PS: this is in no way related to audio buffer latency, which is a completely different thing.

    Good points, but I doubt we'll see much progress in this area until more professionally focused markets arrive on iOS. How many people do you think use the compensation mentioned in AUM, or are even aware of it?

  • edited August 2017

    Nobody - because even this single item is barely usable at all ;)
    You're perfectly right, but today such stuff should be standard in any digital mixer or recording system.

    Phase coherency was frequently overlooked up to the early 2000s, but since then it's been well known, for a time even exaggerated into areas where it's not relevant at all.
    While people tried to prove that all digital mixing engines are created equal because they deliver identical results in summing, it's in fact the signal alignment before final summing that can make a significant difference.

    I've been lucky to have an OS-independent DSP system in which every part of the routing, from input/output to mixer, including each single plugin position, could be checked for sample-accurate signal runtime (if it seemed appropriate). Quite revealing.

    Not that I used the feature extensively, but it was nice to have around.
    As mentioned, I prefer the manual shift of tracks, because I usually record dry and FX on dedicated channels - and that shift is just a snap.

    I agree that iOS in its current state isn't too 'professional', but it has enormous potential.
    Its audio performance really surprised me from day one. And it's fast and silent.

  • 8 months later... AUv3 sampler?

  • As far as I know, AUv3s cannot directly access files from the host and have very limited file access (they can access the files in their own app container, but that's about it), so workarounds are needed in order to transfer files to the plug-in. In the case of Tardigrain, for example, the 'copy' function copies the sample to the app's shared file area and the file then needs to be 'pasted' into the AUv3. Hardly a smooth solution. And in the case of BeatHawk, the sample management (import and export) has to be done in the stand-alone app.

    So what feels like a 'simple' thing to the end user is actually not so simple; if it were, it would have already been done.
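    For reference, the usual workaround looks roughly like this in Swift: an App Group container that both the container app and its AUv3 extension can read, so samples copied there by the app become visible inside the plug-in. The group identifier and folder name are placeholders and would have to match both targets' entitlements:

    ```swift
    import Foundation

    let groupID = "group.com.example.mysampler"   // placeholder App Group ID

    // Shared folder visible to both the app and the extension.
    func sharedSamplesFolder() -> URL? {
        guard let container = FileManager.default
            .containerURL(forSecurityApplicationGroupIdentifier: groupID) else { return nil }
        let folder = container.appendingPathComponent("Samples", isDirectory: true)
        try? FileManager.default.createDirectory(at: folder,
            withIntermediateDirectories: true, attributes: nil)
        return folder
    }

    // Container app side: copy a user-picked file into the shared folder.
    func export(sample url: URL) throws {
        guard let folder = sharedSamplesFolder() else { return }
        let destination = folder.appendingPathComponent(url.lastPathComponent)
        if FileManager.default.fileExists(atPath: destination.path) {
            try FileManager.default.removeItem(at: destination)
        }
        try FileManager.default.copyItem(at: url, to: destination)
    }

    // AUv3 side: list whatever the app has shared.
    func availableSamples() -> [URL] {
        guard let folder = sharedSamplesFolder() else { return [] }
        return (try? FileManager.default.contentsOfDirectory(at: folder,
            includingPropertiesForKeys: nil)) ?? []
    }
    ```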

  • @Samu said:

    In the case of Tardigrain, for example, the 'copy' function copies the sample to the app's shared file area and the file then needs to be 'pasted' into the AUv3. Hardly a smooth solution.

    So what feels like a 'simple' thing to the end user is actually not so simple; if it were, it would have already been done.

    I was wondering about using Tardigrain as even just a barebones AUv3 sampler. This is discouraging.
