VirtualRoom AU is out (AUv3, works with AB3)


Comments

  • @Fritz said:
    VirtualRoom AU v1.1 is out, and it can do nearfield sources, too. Have a look at the "virtual barbershop" type demo on YouTube:

    Just out of curiosity, what did you use to record and automate the 'stuff'?
    I've been experimenting, somewhat successfully, with the BM3 beta by loading VirtualRoom AU on an AUX bus and feeding it two pre-fader channels panned L & R, giving me two separate inputs into the same 'room'.

    Another option would be to use an interface with two separate mics, but that in turn would most likely cause some channel bleed unless those two channels are recorded separately (i.e. one pass for the 'talking' and one pass for the 'razor').

  • @Fritz said:
    VirtualRoom AU v1.1 is out, and it can do nearfield sources, too. Have a look at the "virtual barbershop" type demo on YouTube:

    Hi Fritz, I was excited to see the MIDI control of parameters in this update, but trying it out in Cubasis doesn't show it as a MIDI destination, and using virtual MIDI or network MIDI doesn't yield any control either. There seems to be nowhere to set an incoming MIDI channel on the VirtualRoom plug-in at all, so if I draw in CCs 20 & 21 it moves nothing, even with ALL MIDI outputs and channels set open..
    love this plug but am desperate to get beyond just automating the amount of send to it (though that does give a nice effect)
    any advice?

  • @RockySmalls said:

    @Fritz said:
    VirtualRoom AU v1.1 is out, and it can do nearfield sources, too. Have a look at the "virtual barbershop" type demo on YouTube:

    Hi Fritz, I was excited to see the MIDI control of parameters in this update, but trying it out in Cubasis doesn't show it as a MIDI destination, and using virtual MIDI or network MIDI doesn't yield any control either. There seems to be nowhere to set an incoming MIDI channel on the VirtualRoom plug-in at all, so if I draw in CCs 20 & 21 it moves nothing, even with ALL MIDI outputs and channels set open..
    love this plug but am desperate to get beyond just automating the amount of send to it (though that does give a nice effect)
    any advice?

    This is most likely a 'feature' that is still missing in Cubasis, and only @LFS knows what's cooking.
    Maybe the next Cubasis update will enable IAA/AUv3 plug-in automation?

    The bare minimum could be for each track to optionally transmit the track MIDI to all plug-ins (effects or instruments) assigned to the track, and not just to the 'instrument'.

    Hopefully most DAWs will have some form of AUv3 automation in the near future.

  • @Samu
    yep, AU automation would be a huge leap in the right direction for Cubasis..
    for some reason I have a memory of MIDI-ing effects up before in Cubasis, after asking Johan at klëvgrand about automation and him sending me the few MIDI CCs that their apps utilize.
    but I could just be having another senior moment and it was all just AU instruments.
    I don't want to get involved in a complex AUM/Audiobus-into-Cubasis setup, and BM3/Auria are workflow killers for me.. this virtual room plug is great but it seems I am back to recording any binaural movements live in AudioShare, 'blind', rather than hearing the results directly in a DAW.
    I'm tagging @LFS and @Fritz just to see if anything might spark...

  • @ginandjuice said:
    Noob here...I love this app, can I recreate this effect on a desktop daw like cubase or fl studio?

    LPX has a binaural panner.

  • I'm aware that the MIDI implementation still needs to improve (JUCE forum, I'm coming...). The plugin is set up as a MIDI "consumer", so in Audiobus it has to go in the last slot for MIDI, while for audio it needs to go in the middle slot (-> see pictures).
    While I suppose MIDI input is somehow enabled for the standalone app too, I haven't figured out how to set up a GUI to make connections. But that should come in a future version (funnily, JUCE automatically generates such a GUI on desktop platforms).

  • @Fritz said:

    @ginandjuice said:
    Noob here...I love this app, can I recreate this effect on a desktop daw like cubase or fl studio?

    Well, there's a VST version under development.

    I will definitely buy it when it’s released.

    So just to be clear... Recording automation only works in ab3 at the moment?

  • edited October 2017

    @ginandjuice said:

    So just to be clear... Recording automation only works in ab3 at the moment?

    Automation recording works quite well in BM3. As far as I know AB3 has no AUv3 'automation recording', so the automation must have been done using other methods (external MIDI controller?).

  • Actually, the automation in BM3 works with the AUv3 parameters, but doesn't use the information provided by the plugin when parameter changes are initiated via the UI. Apparently this is not possible while BM3 still supports iOS 9.3.

  • @Fritz said:
    Actually, the automation in BM3 works with the AUv3 parameters, but doesn't use the information provided by the plugin when parameter changes are initiated via the UI. Apparently this is not possible while BM3 still supports iOS 9.3.

    I've been using the 'AUV3 Plug-In UI' of most AUv3's to record the automation in BM3 without any major hassle.
    Some plug-ins, however, do NOT update their UI in response to incoming AUv3 automation.

    Then again I'm on iOS11.0.3 at the moment...

  • @Samu said:

    @Fritz said:
    Actually, the automation in BM3 works with the AUv3 parameters, but doesn't use the information provided by the plugin when parameter changes are initiated via the UI. Apparently this is not possible while BM3 still supports iOS 9.3.

    I've been using the 'AUV3 Plug-In UI' of most AUv3's to record the automation in BM3 without any major hassle.
    Some plug-ins, however, do NOT update their UI in response to incoming AUv3 automation.

    Then again I'm on iOS11.0.3 at the moment...

    I guess my statement was not quite clear. What I meant is that BM3 is not using the information about when a slider movement starts and when it ends (which apparently can be provided only by more recent AUv3 implementations, which would break iOS 9.3 support). So it relies on a more heuristic method which includes read-backs each time a parameter is changed, sometimes leading to lost automation steps. If your hardware is recent, you'll probably not notice any difference, but on older devices it can be quite noticeable.

  • @Fritz said:

    I guess my statement was not quite clear. What I meant is that BM3 is not using the information about when a slider movement starts and when it ends (which apparently can be provided only by more recent AUv3 implementations, which would break iOS 9.3 support).

    That's right. It makes me quite sad that 'development has to be held back' out of fear of keeping devices up-to-date (the BM3 system requirements were discussed during the beta). The irony is that those who fear updating almost scream for features that require the very updates they avoid.

    iOS11.1 is just around the corner...

  • @LFS
    edited October 2017

    @Samu said:
    This is most like a 'feature' that is still missing in Cubasis and only @LFS knows what's cooking.
    Maybe the next Cubasis update will enable IAA/AUv3 plug-in automation?

    Hi Samu,

    Hope you're well...

    The team is working full steam on the next Cubasis update, which will include lots of great new features.
    As usual, details to follow once we're ready to release. Promise, it'll be worth the wait...

    Best,
    Lars

  • @LFS said:

    Hi Samu,

    Hope you're well...

    I'm mostly ok, thanks!

    A few additions to Cubasis I came to think about.

    • Stereo->Mono conversion options (Keep L/R & Mix) when editing an audio track and for audio loaded into the mini-sampler.
    • Split a stereo track into two mono tracks (would enable individual processing of the left and right channels of a stereo recording).
    • iOS 11 Files.app integration (it's enough to expose the 'iTunes folder' to the Files.app).

    The rest is already known by now: file management, triplet grids, higher sequencer resolution, a live histogram for EQs (GarageBand does this, as an example) and AUv3 automation. And to avoid 'blowing the ears', please make 0dB the default output for the compressor instead of 10dB...

    I do love surprises too and I have a feeling we'll not be disappointed by the next update ;)

  • @Samu said:

    @LFS said:

    I do love surprises too and I have a feeling we'll not be disappointed by the next update ;)

    Hope so too... ;)

  • @LFS said:

    @Samu said:

    @LFS said:

    I do love surprises too and I have a feeling we'll not be disappointed by the next update ;)

    Hope so too... ;)

    cubasis updates.. no big pre-hype, just another step in the right direction,
    " lots of great new features " sounds like more than just bug fixes! :)

  • @RockySmalls said:

    @LFS said:

    cubasis updates.. no big pre-hype, just another step in the right direction,
    " lots of great new features " sounds like more than just bug fixes! :)

    Definitely much more than just bug fixes, and lots of great new features included.
    And... will be another free update for existing users (in-app purchases excluded, of course)....

    Best,
    Lars

  • @Fritz said:
    I guess my statement was not quite clear. What I meant is that BM3 is not using the information about when a slider movement starts and when it ends (which apparently can be provided only by more recent AUv3 implementations, which would break iOS 9.3 support).

    Could you elaborate please? As far as I know, proper AU automation has been possible from the start. In fact my AUs (introduced during iOS 9) all automate fine. You don't need to worry about slider movements etc. Just put a listener on the Parameter Tree and you get notifications as soon as a value has changed. That way you can completely decouple automation from your UI implementation.

    :)

  • @LFS said:

    @RockySmalls said:

    @LFS said:

    cubasis updates.. no big pre-hype, just another step in the right direction,
    " lots of great new features " sounds like more than just bug fixes! :)

    Definitely much more than just bug fixes, and lots of great new features included.
    And... will be another free update for existing users (in-app purchases excluded, of course)....

    Best,
    Lars

    in-app purchases! not a problem, I bought the drum machine IAP to keep the faith, even though I have my own preferred drumboxes already mini-sampled. Small payments to keep it rolling are absolutely a must. keep it stable, keep it clean! :)
    ( p.s. au automation OR midi to FX plugs is clearly a necessity, I can't think of any other requests I could ask for cubasis that @Samu hasn't already outlined, but I'm certain you guys already understand that and are all over it for the near future. oh! maybe 12.9" screen support for those lucky or 'pro' users out there.)

  • I think 'Track Folders' is one feature that could come in handy once the number of tracks starts to increase.
    Another one would be 'Track/Channel Groups' and 'AUX Busses' that would allow more flexible signal routing.

    The 'sources' in the Mini-Sampler are currently somewhat limited to IAA generators (there seems to be no way to 'sample' an 'IAA instrument') and/or Audiobus sources, in addition to the regular audio input.

    Some form of 'MIDI processor' could come in handy too. I still remember the MIDI effects & filtering that I could use in a pre-VST version of Cubase :) (And the custom 'MIDI Mixer' was totally awesome and allowed one to create MIDI control surfaces that could be used when recording).

    I do get that some users desperately need side-chain compression, but every time I hear it 'over-used' / 'abused' in certain genres of 'music' I feel like vomiting, as I'm NOT a fan of the 'pumping sound'...

    Sometimes I'm almost too picky and forget to enjoy what we already have on iOS and focus too much on what we do NOT have yet :)

  • but having the Channel Strip Gate sidechain routable from another channel would be a tremendous boon. hours of volume automation drawing felled in one swoop! :)
    any road, this was supposed to be a thread about Virtual Room. Apologies Fritz, but fingers crossed a cubasis update with au automation will really open up the use of yr fine app without you having to mess with it.. bated breath.

    @Samu said
    "I do get that some users desperately need side-chain compression, but every time I hear it 'over-used' / 'abused' in certain genres of 'music' I feel like vomiting, as I'm NOT a fan of the 'pumping sound'..."

  • @brambos said:

    @Fritz said:
    I guess my statement was not quite clear. What I meant is that BM3 is not using the information about when a slider movement starts and when it ends (which apparently can be provided only by more recent AUv3 implementations, which would break iOS 9.3 support).

    Could you elaborate please? As far as I know, proper AU automation has been possible from the start. In fact my AUs (introduced during iOS 9) all automate fine. You don't need to worry about slider movements etc. Just put a listener on the Parameter Tree and you get notifications as soon as a value has changed. That way you can completely decouple automation from your UI implementation.

    :)

    Well, I’m not implementing that part in detail, as I’m using JUCE. But I’m calling JUCE’s begin/endChangeGesture, which should give the required info to the host when a parameter is edited. Apparently BM3 is not using this information, relying instead on writing and reading back parameters. Yes, all my GUI updates are done in a method called by a timer, so in theory GUI redrawing should not affect the parameter readout, but in practice it does, for a reason I don’t know. But as this seems to be a temporary problem, I’m not too worried about it.

  • @Fritz said:

    @brambos said:

    @Fritz said:
    I guess my statement was not quite clear. What I meant is that BM3 is not using the information about when a slider movement starts and when it ends (which apparently can be provided only by more recent AUv3 implementations, which would break iOS 9.3 support).

    Could you elaborate please? As far as I know, proper AU automation has been possible from the start. In fact my AUs (introduced during iOS 9) all automate fine. You don't need to worry about slider movements etc. Just put a listener on the Parameter Tree and you get notifications as soon as a value has changed. That way you can completely decouple automation from your UI implementation.

    :)

    Well, I’m not implementing that part in detail, as I’m using JUCE. But I’m calling JUCE’s begin/endChangeGesture, which should give the required info to the host when a parameter is edited. Apparently BM3 is not using this information, relying instead on writing and reading back parameters. Yes, all my GUI updates are done in a method called by a timer, so in theory GUI redrawing should not affect the parameter readout, but in practice it does, for a reason I don’t know. But as this seems to be a temporary problem, I’m not too worried about it.

    Ah thanks, that clears it up. It's a JUCE limitation then, not necessarily an AU/iOS limitation. I have no experience with JUCE so I'm not familiar with its constraints.

    B)

  • @brambos said:
    Ah thanks, that clears it up. It's a JUCE limitation then, not necessarily an AU/iOS limitation. I have no experience with JUCE so I'm not familiar with its constraints.

    B)

    I’m not sure about that. I think the problem is also linked to the automation implementation by BM3.

  • edited December 2017

    Hello people, hello Fritz.
    I have just bought this app and I am testing it in Cubasis. It sounds really amazing as an effect plugin.
    My intention is to use it for a different purpose:
    as a virtual studio monitor for headphone mixing.

    I usually mix my music on headphones, but this is not easy. Music on headphones sounds different than on speakers. Especially when setting up the panning for instruments, it is difficult to work with headphones.
    The idea is to use VirtualRoom for headphone mixing.
    In Cubasis I loaded the AUv3 plugin and set the listener in a small room, the speakers placed in a triangle like in a near-field monitoring situation.

    It sounds really realistic; however, the sound is brighter than without the plugin.
    Do the virtual speakers have a flat frequency response, or is the virtual room colouring the sound due to reflections? In the latter case it would be nice if the virtual room could have frequency damping like a studio room. I think this would be a great selling point for this app.
    There is a demand for a headphone mix tool:
    https://forum.audiob.us/discussion/comment/388503/#Comment_388503

  • Good morning
