Let's talk about spatialization...

I think this is still one of the weak parts of iOS, or rather a lack of tools. I'm open to suggestions, of course.
I prefer my acoustic sounds (and lately often synths too) dry, so I can place them in my own space.
But a good reverb is only half of it. I want to put the sound into a virtual space - easy to say, a bit like staging an orchestra. I'd also like to do it with synths: an orchestra of synths where they are all in the same space (hall/room etc.) but not at the same position (so one might be 45 degrees to the right and closer, and so on).
As I learned just a few days ago, you should put these tools before your reverb. The only thing that comes close to what I want is VirtualRoomPro, but it colors/changes the sound source quite drastically and only seems to work properly on its own.
Simple panning is not an option. There are some good tools for Mac, but I would like to do similar things on my iOS device as well.
The short question: is there anything out there that can do this on iOS?
I've only used these tools for a short time, and sometimes the effect is very subtle... until you remove it and realize you can't live without it anymore :D

Comments

  • Along with Virtual Room Pro, I would recommend this DDMF Directional EQ plugin https://ddmf.eu/directionaleq-equalizer-plugin/

  • Thx... I'll check whether that can do the job.
    But is there really nothing else like Virtual Room Pro, just without any coloring of the sound source?
    For some reason Virtual Room Pro followed by a reverb doesn't sound the way I would expect.

  • edited October 2018

    In a real space, things that are closer to you are naturally drier compared to things at the back of the room, but there is also a time delay between when you hear the direct sound and the first bit of reverb. This is because the speed of sound means the sound has to take a round trip: from the source to the back wall, and then forward again to your ears. Roughly 1 ms per foot / 3 ms per metre. Sounds at the back of the room arrive with the reverb right on top of them. Sounds close to you arrive well ahead of the reverb. This spatial placement is approximated with the predelay knob on your reverb. Say you wanted to make a sound seem like it is 20 feet from the back wall: you would try adding 40 ms of predelay, for the time it takes the sound's round trip to hit the wall and then reach your ears.

    The predelay is interesting, too, because sounds are clearer and crisper when the reverb is more distinct/separate from the direct sound, whereas with no predelay the sounds are washier.
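
    To put rough numbers on that (just a sketch; the function and parameter names are made up):

    import Foundation

    // Predelay that roughly matches a source's distance from the back wall:
    // the first reflection travels source -> back wall -> listener, i.e. about
    // twice that distance further than the direct sound (~343 m/s speed of sound).
    func predelayMs(distanceToBackWallMetres: Double, speedOfSound: Double = 343.0) -> Double {
        let extraPathMetres = 2.0 * distanceToBackWallMetres
        return extraPathMetres / speedOfSound * 1000.0
    }

    // The 20 ft (~6 m) example above comes out around 35-40 ms.
    print(predelayMs(distanceToBackWallMetres: 6.1))   // ~35.6 ms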

  • These are definitely areas I need to think about more

  • @Processaurus said:
    In a real space, things that are closer to you are naturally drier compared to things at the back of the room, but there is also a time delay between when you hear the direct sound and the first bit of reverb. This is because the speed of sound means the sound has to take a round trip: from the source to the back wall, and then forward again to your ears. Roughly 1 ms per foot / 3 ms per metre. Sounds at the back of the room arrive with the reverb right on top of them. Sounds close to you arrive well ahead of the reverb. This spatial placement is approximated with the predelay knob on your reverb. Say you wanted to make a sound seem like it is 20 feet from the back wall: you would try adding 40 ms of predelay, for the time it takes the sound's round trip to hit the wall and then reach your ears.

    The predelay is interesting, too, because sounds are clearer and crisper when the reverb is more distinct/separate from the direct sound, whereas with no predelay the sounds are washier.

    Like I said, a reverb is just half of it if you're trying to get a realistic result with many sound sources in the same space.
    Of course there are tons of parameters; the early reflections, for example, have more energy when you are closer, and so on.

  • Things also sound crisper and bassier when they are close to you. Rolling off some of the highs (somewhere above 5 kHz to 10 kHz) can help the illusion of something being pushed back. I've had better luck with a high-shelf EQ than with a lowpass filter.
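
    A rough Swift sketch of that idea with AVAudioUnitEQ (my own example; the exact frequency and gain are placeholders to tune by ear):

    import AVFoundation

    // One high-shelf band that gently cuts the highs to push a source "back" in the mix.
    let eq = AVAudioUnitEQ(numberOfBands: 1)
    let shelf = eq.bands[0]
    shelf.filterType = .highShelf
    shelf.frequency = 6000        // somewhere in the 5-10 kHz range mentioned above
    shelf.gain = -4               // a few dB of cut; adjust to taste
    shelf.bypass = false

    // Attach `eq` between the source node and the reverb in an AVAudioEngine graph.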

  • I need new headphones :D

  • @Fruitbat1919 said:
    I need new headphones :D

    I need more lifetime to explore it all :D :D

  • @Cib said:

    @Fruitbat1919 said:
    I need new headphones :D

    I need more lifetime to explore it all :D :D

    That’s my advantage, I have lots of spare time in between head pains lol

  • And a little example to fake a sound that jumps through a space :)
    First dry... then with the staging tool and reverb added.
    The dry sound is the same synth layered four times, but each layer gets one more MIDI repeater.
    Each instance also has different settings... simply put, it goes from left/close to right/far in 4 big steps.
    Of course more, and finer, steps would be possible as well.
    I wanted to try something similar with Model D, VirtualRoomPro and AudioReverb, but I've failed so far.
    But I won't give up.
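
    Roughly what I mean, as a sketch (made-up parameter names and ranges - map them onto whatever the staging tool and reverb actually expose):

    import Foundation

    struct StagePosition {
        let pan: Double        // -1 (left) ... +1 (right)
        let distance: Double   //  0 (close) ... 1 (far)
        let reverbMix: Double  //  wet/dry amount, roughly tracking distance
    }

    // Interpolate from left/close to right/far in `count` big steps, one per instance.
    func stagingSteps(count: Int) -> [StagePosition] {
        (0..<count).map { i in
            let t = Double(i) / Double(max(count - 1, 1))
            return StagePosition(pan: -1.0 + 2.0 * t,
                                 distance: t,
                                 reverbMix: 0.2 + 0.6 * t)
        }
    }

    for (i, p) in stagingSteps(count: 4).enumerated() {
        print("instance \(i): pan \(p.pan), distance \(p.distance), mix \(p.reverbMix)")
    }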

  • edited October 2018

    As long as it's not about mathematically correct virtual space design, I've found AUM a very useful tool.
    In a simplified layout I use an input channel with 2 bus sends: bus A for the dense small-to-mid rooms via AD480, bus B for the more airy, wide stuff with the Amazing Noises Reverb FDN for modulated tails and AuFx Reverb for a more static response.
    (only one of them engaged at a time)

    Bus A goes to the main out, but also sends to bus B for some added wideness, if required.
    Mixing from dry to completely faded-out ambiences is done with just the 3 faders of the input channel, bus A and bus B, which delivers a quick and precise result.
    Each bus can also have AUM's channel processors engaged for EQ, pan and dynamics.

    The Amazing Noises Reverb FDN was quite a surprise in small ambience settings.
    It has a 'very clean' character on its own, but the physical impression was convincing.
    Reversing the processing sequence and 'blurring' its small (or even tiny) rooms with the AD480 would be an easy switch in AUM (not tried yet).
    As my focus was guitar ambience, I haven't engaged panning yet - but I couldn't resist replacing the guitar with the Casio SK-1, and that was outright stunning.
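
    Not AUM itself, but roughly the same two-send layout sketched with AVAudioEngine, just to make the routing explicit (Apple's reverb presets stand in for AD480 and the bus B tails; a skeleton only - no file is scheduled and the engine isn't started):

    import AVFoundation

    let engine = AVAudioEngine()
    let source = AVAudioPlayerNode()
    let busA = AVAudioMixerNode()       // "dense small-to-mid room" send
    let busB = AVAudioMixerNode()       // "airy wide" send
    let roomA = AVAudioUnitReverb()
    let roomB = AVAudioUnitReverb()
    roomA.loadFactoryPreset(.smallRoom)
    roomB.loadFactoryPreset(.largeHall)
    roomA.wetDryMix = 100               // sends carry wet signal only
    roomB.wetDryMix = 100

    let nodes: [AVAudioNode] = [source, busA, busB, roomA, roomB]
    for node in nodes { engine.attach(node) }

    let main = engine.mainMixerNode
    let format = main.outputFormat(forBus: 0)

    // Input channel: dry path to the main out plus the two bus sends.
    engine.connect(source, to: [
        AVAudioConnectionPoint(node: main, bus: main.nextAvailableInputBus),
        AVAudioConnectionPoint(node: busA, bus: busA.nextAvailableInputBus),
        AVAudioConnectionPoint(node: busB, bus: busB.nextAvailableInputBus)
    ], fromBus: 0, format: format)

    // Bus A -> room A -> main out, and also into bus B for the added wideness.
    engine.connect(busA, to: roomA, format: format)
    engine.connect(roomA, to: [
        AVAudioConnectionPoint(node: main, bus: main.nextAvailableInputBus),
        AVAudioConnectionPoint(node: busB, bus: busB.nextAvailableInputBus)
    ], fromBus: 0, format: format)

    // Bus B -> room B -> main out.
    engine.connect(busB, to: roomB, format: format)
    engine.connect(roomB, to: main, format: format)

    // "Dry to fully faded-out ambience" is then just these three faders.
    source.volume = 1.0
    busA.outputVolume = 0.6
    busB.outputVolume = 0.3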

  • On my album Electric Ian I used the Binaural Panner quite a lot at the mixdown and mastering stage in Logic Pro X. Obviously I didn't binpan everything, just selected tracks - typically lead synth lines, prominent padding lines and suchlike (not voice, and not percussion). I didn't want to make a thing of it, so I've never mentioned it, but it did give me the opportunity to position things in a 3D sphere and pull the sound apart quite usefully (and in most cases automate the positioning somewhat). You shouldn't really be able to notice it; I've kept it below ridiculous on the meter.

  • @u0421793 said:
    On my album Electric Ian I used the Binaural Panner quite a lot at the mixdown and mastering stage in Logic Pro X. Obviously I didn't binpan everything, just selected tracks - typically lead synth lines, prominent padding lines and suchlike (not voice, and not percussion). I didn't want to make a thing of it, so I've never mentioned it, but it did give me the opportunity to position things in a 3D sphere and pull the sound apart quite usefully (and in most cases automate the positioning somewhat). You shouldn't really be able to notice it; I've kept it below ridiculous on the meter.

    But panning doesn't work well in this case. Sometimes you only hear the difference once you remove the panning done with these special tools.
    A free tool like Panagement, for example, is nice (it even offers LFOs to move through the space, but it's not available for iOS), yet I think it's also not quite accurate.
    The thing is, you should have all your early and late reflections and the parameters of the space you want to put your instruments in, and the staging tool should do the rest without coloring the sound.

  • edited October 2018
    The user and all related content has been deleted.
  • @Telefunky said:
    As long as it's not about mathematically correct virtual space design, I've found AUM a very useful tool.
    In a simplified layout I use an input channel with 2 bus sends: bus A for the dense small-to-mid rooms via AD480, bus B for the more airy, wide stuff with the Amazing Noises Reverb for modulated tails and AuFx Reverb for a more static response.
    (only one of them engaged at a time)

    Bus A goes to the main out, but also sends to bus B for some added wideness, if required.
    Mixing from dry to completely faded-out ambiences is done with just the 3 faders of the input channel, bus A and bus B, which delivers a quick and precise result.
    Each bus can also have AUM's channel processors engaged for EQ, pan and dynamics.

    The Amazing Noises Reverb was quite a surprise in small ambience settings.
    It has a 'very clean' character on its own, but the physical impression was convincing.
    Reversing the processing sequence and 'blurring' its small (or even tiny) rooms with the AD480 would be an easy switch in AUM (not tried yet).
    As my focus was guitar ambience, I haven't engaged panning yet - but I couldn't resist replacing the guitar with the Casio SK-1, and that was outright stunning.

    That might work up to a certain point and for some things, but normally you would use these FX not as sends but on each track on its own.
    Also, in this case you wouldn't use any panning, since the staging FX does it all.

  • edited October 2018

    @Cib said:
    And a little example to fake a sound that jumps through a space :)
    First dry... then with the staging tool and reverb added.
    The dry sound is the same synth layered four times, but each layer gets one more MIDI repeater.
    Each instance also has different settings... simply put, it goes from left/close to right/far in 4 big steps.
    Of course more, and finer, steps would be possible as well.
    I wanted to try something similar with Model D, VirtualRoomPro and AudioReverb, but I've failed so far.
    But I won't give up.

    What is "the staging tool"? Sounds coooOOOL!

  • @Max23 said:

    @Cib said:
    Simple panning is not an option. There are some good tools for Mac, but I would like to do similar things on my iOS device as well.
    The short question: is there anything out there that can do this on iOS?
    I've only used these tools for a short time, and sometimes the effect is very subtle... until you remove it and realize you can't live without it anymore :D

    The "move a virtual sound object around in a room" thing that works on headphones doesn't exist on iOS.

    But a different idea would be to have the predelay of the reverb synced to the tempo of the music, like 1/64 or something...

    Of course reverbs with synced predelay exist, but the tempo of your music doesn't have much to do with the space you might want.
    The 2CAudio example I posted (the text) is indeed the best approach yet, and maybe even a new one.
    Their new FX is called Precedence and you have to put it before the reverb. Then in the reverb (in this case Breeze 2.1) you set the mix/balance to the same amount as the distance setting in Precedence to get a perfect match.
    It might be very subtle in some cases (in others not), but it's really a great thing.
    I'm sure you can get quite nice results with the right combination of reverbs and good settings on iOS, but it's again a pain. I just tried to place some acoustic instruments like iSymphonic on a virtual stage, but it just sounds too fake, and it all gets messed up the more you add.
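
    For what it's worth, the synced-predelay idea is easy to put numbers on (just my own sketch, nothing to do with Precedence), and it shows why it only matches a given room size by coincidence:

    import Foundation

    // Length of one 1/`division` note at a given tempo, in milliseconds.
    // A whole note is 4 beats, so at `bpm` it lasts 4 * 60 / bpm seconds.
    func syncedPredelayMs(bpm: Double, division: Double) -> Double {
        return 4.0 * 60.0 / bpm / division * 1000.0
    }

    print(syncedPredelayMs(bpm: 120, division: 64))   // 31.25 ms
    print(syncedPredelayMs(bpm: 90,  division: 64))   // ~41.7 ms - a different "room" at a different tempo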

  • The user and all related content has been deleted.
  • @AudioGus said:

    @Cib said:
    And a little example to fake a sound that jumps through a space :)
    First dry... then with the staging tool and reverb added.
    The dry sound is the same synth layered four times, but each layer gets one more MIDI repeater.
    Each instance also has different settings... simply put, it goes from left/close to right/far in 4 big steps.
    Of course more, and finer, steps would be possible as well.
    I wanted to try something similar with Model D, VirtualRoomPro and AudioReverb, but I've failed so far.
    But I won't give up.

    What is "the staging tool"? Sounds coooOOOL!

    Oh, it's just this (together with a reverb, since it's only one half of the staging):
    https://2caudio.com/products/precedence#_Overview
    Not iOS (but it would easily run tons of instances on an iPad for sure).
    I really thought VirtualRoom could do at least something similar, but it just doesn't work well.
    However, I will experiment further. If not... it goes on my wishlist of FX I want for iOS.
    It might be more important for orchestral works than for electronic music, but I'm starting to experiment with these things there too. Sound should be as 3D as it can get for me :) But then I would never even touch those crappy Beats things or Apple's headphones, so call me an audiophile. Most people might not need these things at all.

  • @Cib actually the example layout was about a single track (guitar).
    The Amazing Noises Reverb FDN can run multiple AU instances, so it could be adapted for multiple synths; it's only the AD480 that is IAA.
    It depends on personal taste whether you want to apply it as an additional tonal stage - imho it's different enough from the pack to be a valid option when it comes to iOS reverbs.

  • @Max23 said:
    Hm, it's supposed to work together - level, panning and reverb - to get the virtual room thing. It will be a drag to set this up with different plugins.

    That's the bad thing.
    Of course I'd prefer to have it in one tool (or two that offer settings designed to work together).
    The other bad thing is that while I was trying it again I got this buzzing noise and my Audiobus session broke again.
    One of those moments where I hate iOS again... but I won't give up so fast.

  • @Telefunky said:
    @Cib actually the example layout was about a single track (guitar).
    The Amazing Noises Reverb FDN can run multiple AU instances, so it could be adapted for multiple synths; it's only the AD480 that is IAA.
    It depends on personal taste whether you want to apply it as an additional tonal stage - imho it's different enough from the pack to be a valid option when it comes to iOS reverbs.

    Thanks. I really wish there were more videos about this specific usage, or that I could demo some apps. I've bought so much and it adds up, but mostly I end up a bit disappointed.

  • edited October 2018

    @Cib said:

    @Telefunky said:
    @Cib actually the example layout was about a single track (guitar).
    The Amazing Noises Reverb FDN can run multiple AU instances, so it could be adapted for multiple synths; it's only the AD480 that is IAA.
    It depends on personal taste whether you want to apply it as an additional tonal stage - imho it's different enough from the pack to be a valid option when it comes to iOS reverbs.

    Thanks. I really wish there were more videos about this specific usage, or that I could demo some apps. I've bought so much and it adds up, but mostly I end up a bit disappointed.

    Not as advanced, but I imagine you've tried...?

    https://klevgrand.se/products/haaze/

    (iOS, with a free PC demo)

    https://www.youtube.com/watch?time_continue=47&v=AKREu-DIlPw

  • edited October 2018

    @Cib Sorry, I missed your example above while writing... it makes it much clearer what you're after. It should work nicely with the Amazing Noises thing, as it can be MIDI-automated.
    (LFO the pan and alter the decay length in quarter steps - there are several parameters that are 'fade out' candidates)
    PS: quick check for completeness - panning or level isn't critical, but other parameters should be sent as MIDI CC at proper intervals (or together with notes), otherwise the reverb has no time to decay and turns into a swoosh generator.

  • The user and all related content has been deleted.
  • edited October 2018

    @Max23 said:
    btw I think convolution beats the hell out of any algorithm for realistic-sounding reverbs...

    I totally disagree here. While some convolution reverbs are amazing, I much prefer algorithmic reverbs.
    Normally an IR just gives you something like a photo, while an algo verb gives you a movie.
    A real space also never sounds exactly the same from one second to the next, even with the same instrument/note. It's all breathing.
    However, sometimes the right IR does the best job if you need a specific space.
    But it also places every instrument at the same spot in the space, which is not natural at all.
    Space Designer for GarageBand would be welcome :)

  • edited October 2018
    The user and all related content has been deleted.
  • edited October 2018

    Another little example, where I put dry strings (at least as dry as close mics can get) a bit to the left and further away, and a dry violin closer and more to the right, but in the same space (reverb). Each instrument got the same reverb setting but different mix settings and distance/angle in the staging tool.
    Of course it's not fine-tuned, and better results can be achieved.

  • I gotta disagree. I don't know if there are any top-notch convolution reverbs on iOS, but something like Altiverb is unsurpassed at placing something in what seems like a real space. It is useful both for reverb and for positioning. There are some awesome examples with virtual orchestral instruments that really put the orchestra sections in space.

    @Cib said:

    @Max23 said:
    btw I think convolution beats the hell out of any algorithm for realistic-sounding reverbs...

    I totally disagree here. While some convolution reverbs are amazing, I much prefer algorithmic reverbs.
    Normally an IR just gives you something like a photo, while an algo verb gives you a movie.
    A real space also never sounds exactly the same from one second to the next, even with the same instrument/note. It's all breathing.
    However, sometimes the right IR does the best job if you need a specific space.
    But it also places every instrument at the same spot in the space, which is not natural at all.
    Space Designer for GarageBand would be welcome :)
