
Does Audiobus affect the sound quality of my apps?

Hi,

Does Audiobus have any effect on the sound quality of the apps it busses?

Comments

  • edited January 2013

    Occasionally I've noticed some pops and clicks when I'm overloading the iPad, but these sounds did not get recorded into Loopy or MTDaw. They didn't even cause any noticeable artifacts in the recorded files, which to me was a nice, unexpected surprise.

    I'm not an iOS dev, but from what I've seen and read and heard, Sebastian and Michael have made it a vital point NOT to interfere with the audio, but to construct a bridge for the audio to travel across from one app to another.

    ***removed unnecessary adjective which could have been interpreted as diminishing the work of the devs, which is not what I intended.

  • Audiobus lets you record and stream audio at 16-bit, 44.1 kHz. If you're asking whether it adds lossy compression or anything else that could be heard in the final render of your project: no, it does not.
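For a sense of scale, the raw data rate of a 16-bit, 44.1 kHz stereo stream is easy to work out. This is a generic back-of-envelope sketch of standard PCM arithmetic, not anything Audiobus-specific:

```python
# Raw (uncompressed) data rate of a 16-bit / 44.1 kHz stereo PCM stream,
# i.e. standard CD quality.
sample_rate = 44_100   # samples per second, per channel
bit_depth = 16         # bits per sample
channels = 2           # stereo

bits_per_second = sample_rate * bit_depth * channels
bytes_per_second = bits_per_second // 8

print(bytes_per_second)                 # 176400 bytes per second
print(bytes_per_second * 60 // 10**6)   # about 10 MB per minute
```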

  • Thanks guys,

    I have looked around the web and am trying to make some sense of this.

    Is 16-bit/44.1 kHz the ceiling for the iPad itself, or just for Audiobus?

    For example, if Auria were to support Audiobus, would it be possible to record from, say, Animoog into Auria at a higher rate, say 24-bit/96 kHz?

    It's not clear to me what the music apps themselves output at...

    Can you make sense of it for me please....

  • No, the iPad is not capped at 16/44. At least, not anymore. But it also doesn't play very nicely with hi-res audio files, as I've seen on a number of occasions.

    People have managed to transfer 24/192 files to iPads and output 24/96 for sure, using USB digital audio converters (in our case, an audio interface).

    It is likely that many more apps will support 24/96 in the future, on newer iPads; my guess is that generating, routing, processing, and recording multiple streams of 24/96 audio is a little beyond the iPad's capabilities just yet. I'm not sure whether a 4th-gen could handle it, but I doubt my 3rd-gen could do much with it.

    The devs will have to be the ones to explain why Audiobus doesn't support 24/96 as of now, but my money is on processing limitations, and therefore user experience, being a big part of it. As the technology evolves, so will the app.

  • It's a matter of performance and bandwidth. iPads can handle pretty much anything; it's just a question of whether you need it in real time. :)

  • Morning guys,

    Thanks for the responses, this discussion is very interesting...

    Still a bit woolly/vague, though. But I guess if you're saying that Audiobus doesn't add any lossy compression, the iPad is not capped at 16/44, and iPads can handle pretty much anything, then that would lead me to think that if I were running MultiTrack DAW at 24-bit/96 kHz and recording a single audio stream from iPolysix via Audiobus, it would record at 24/96 as set in MultiTrack DAW, without any effect from Audiobus.

    Am I right ?

  • Audiobus streams audio at 44.1 kHz, 16-bit. What MultiTrack does with that is up to MultiTrack, but it cannot improve the quality just by changing its internal settings.
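As a sketch of why a host app can't improve what it receives: "upconverting" a 16-bit sample into a 24-bit container just shifts in zero bits, so nothing that was lost at 16 bits comes back. A minimal illustration (the sample value is arbitrary):

```python
# Padding a 16-bit PCM sample into the 24-bit range adds no information.
sample_16 = 12345            # some 16-bit PCM sample value
sample_24 = sample_16 << 8   # same value, scaled into 24-bit range

# Round-tripping recovers exactly the original: nothing was gained.
assert (sample_24 >> 8) == sample_16
print(sample_24)             # 3160320
```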

  • The sound quality via Audiobus is the same as if you render/audiocopy a file in Korg's iPolysix. According to the iPolysix manual, that is done at 44.1 kHz/16-bit. I wouldn't be surprised if this is true for most synth apps.

  • Ok, thanks guys.

  • Any synth app that does more than 16/44 would be wasting processor cycles. Recording audio via a microphone at higher resolutions is a different story.

  • Hi Syrupcore,

    I'm just trying to educate myself, I guess; I'd never really covered the subject of iPad app sound quality before, and it's been hard to get any clarity on the net.

    By wasting processor cycles, do you mean that a higher bit depth or sample rate would not improve the sound quality of the app?

  • Hi,

    After reading this article it's starting to make a bit more sense to me....

    http://www.head-fi.org/t/415361/24bit-vs-16bit-the-myth-exploded

    Thanks for your support.

  • You got it. That article says what I'm saying, just about 1000 times better!

    When you're capturing a very dynamic instrument like fingerpicked guitar or a drum kit, the extra dynamic range/headroom that 24 bits provides makes a lot of sense. Otherwise it's superfluous (and that extra data still needs to be processed by the CPU).
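The headroom claim can be made concrete with the textbook dynamic-range formula for linear PCM, roughly 6.02 dB per bit. This is a generic illustration, not anything Audiobus-specific:

```python
import math

# Theoretical dynamic range of linear PCM: about 6.02 dB per bit.
def dynamic_range_db(bits: int) -> float:
    return 20 * math.log10(2 ** bits)

print(round(dynamic_range_db(16), 1))   # 96.3 dB -- plenty for a finished mix
print(round(dynamic_range_db(24), 1))   # 144.5 dB -- headroom while tracking
```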

  • Wow. Thanks guys. I had no idea. Great find on that article.

  • Actually, since almost all iOS music apps do their internal processing in floating-point format, 16-bit vs. 24-bit isn't really a CPU-processing differentiator; it's more of a storage and memory one. However, running at high sample rates (96 kHz, etc.) definitely is an extra CPU, storage, and memory load.
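A minimal sketch of the round trip Sonosaurus describes, assuming 16-bit integer PCM is normalized to float for processing and converted back on output. The variable names are illustrative, not any real iOS API:

```python
import array

# Apps typically convert integer PCM to floating point before mixing and
# effects, so the DSP cost is the same whether the source file was 16-bit
# or 24-bit; only storage and memory differ.
pcm16 = array.array('h', [0, 16384, -16384, 32767])   # 16-bit samples

floats = [s / 32768.0 for s in pcm16]                 # normalize to [-1.0, 1.0)
back = [int(round(f * 32768.0)) for f in floats]      # back to 16-bit on output

assert back == list(pcm16)   # lossless round trip for 16-bit material
```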

  • Thanks Sonosaurus,

    I think my brain is about to melt with all this....

    I had to find and read this article to get some understanding....
    http://floating-point-gui.de/formats/fp/

    Given what you just explained, how then do you quantify the sound quality of a music synth app? Is it, for example, bit depth and sample rate? Or does that even factor into it, or are they only a factor when recording the audio from the synth app?

    It's late so I hope this makes sense...

    Dave.
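For what it's worth, the two numbers discussed throughout this thread map onto textbook quantities: bit depth sets the dynamic range, and sample rate sets the highest frequency that can be represented (the Nyquist limit, half the sample rate). A quick generic illustration of the latter:

```python
# Nyquist: a sample rate of f Hz can represent frequencies up to f/2.
def nyquist_hz(sample_rate_hz: float) -> float:
    return sample_rate_hz / 2.0

print(nyquist_hz(44_100))   # 22050.0 -- already above the ~20 kHz limit of hearing
print(nyquist_hz(96_000))   # 48000.0
```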
