Sample Rate Poll

So, just a question about the options open to me, since my audio interface supports up to 96kHz. The main reason I started looking into this was latency. Higher sample rates also mean lower latency AND better audio quality. So running at 44.1kHz with a 128-sample buffer is equivalent in latency to running at 88.2kHz with a 256-sample buffer, but the audio quality is better and the demands on the CPU are different.

Currently, my apps and/or Audiobus seem to default to 44.1kHz, so the higher sample rates don’t look like a real option yet. I’ve tried using AUM to control for this, and it works when I build the session, but when I save a preset and open it all up again, everything seems to revert to 44.1kHz.
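
For context: when a host asks iOS for a sample rate it is only stating a preference, and the audio session reports back whatever the current route actually settled on, which is one reason a reloaded session can come back at 44.1kHz. A minimal sketch of that negotiation, assuming a plain AVAudioSession setup (error handling trimmed to a print):

```swift
import AVFoundation

// A preferred sample rate is only a request; the session reports back what the
// hardware/route actually gave us, and that is the rate hosts end up running at.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord, mode: .default, options: [])
    try session.setPreferredSampleRate(96_000)
    try session.setPreferredIOBufferDuration(256.0 / 96_000.0)  // ask for ~2.7 ms buffers
    try session.setActive(true)
} catch {
    print("audio session setup failed:", error)
}
print("actual sample rate:", session.sampleRate)                // may still be 44100
print("actual IO buffer:", session.ioBufferDuration * 1000, "ms")
```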

Now I’m not saying that 44.1kHz isn’t good enough. With a low enough buffer size, I can get the latency down to something I personally feel I can work well with (less than 10ms round trip). I’m just wondering if anyone else is using anything other than 44.1kHz in their ongoing setup that includes Audiobus.
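
For anyone wanting to sanity-check the numbers above: the latency contributed by one buffer is simply frames divided by sample rate, so 128 frames at 44.1kHz and 256 frames at 88.2kHz both come out around 2.9ms. A throwaway sketch of the arithmetic (the function name is made up, nothing app-specific):

```swift
// Latency contributed by one buffer, in milliseconds: frames / sampleRate.
func bufferLatencyMs(frames: Double, sampleRate: Double) -> Double {
    frames / sampleRate * 1000
}

print(bufferLatencyMs(frames: 128, sampleRate: 44_100))     // ≈ 2.90 ms
print(bufferLatencyMs(frames: 256, sampleRate: 88_200))     // ≈ 2.90 ms

// A round trip is at least one input buffer plus one output buffer, plus
// converter/driver overhead, so 128 frames at 44.1kHz sits well under ~10 ms.
print(2 * bufferLatencyMs(frames: 128, sampleRate: 44_100)) // ≈ 5.80 ms
```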

44kHz Versus Higher Sample Rate
  1. What sample rate do you use in your regular iOS setup? (47 votes)
    1. I’m using 44 kHz: 78.72%
    2. I’m using a sample rate higher than 44 kHz: 21.28%

Comments

  • I feel that iOS support for anything other than 44.1 kHz and 16 bit is dicey. I don’t usually want to risk having things go all wrong when I am trying to get a final file render out. I feel like there are enough things that can go wrong when using a lot of apps and a USB audio device that I don’t want to raise the potential for more chaos and unpredictability.

  • I do both in Auria. Unfortunately the dudes I work with start with 48k projects and then send me the stems. I used to run into a lot of problems with IAA synths & effects and their processing ability at 48k, but things are much better now if I stick to AUs.

    Things have gotten better, but overall 44.1/16 bit is the safest (and intended?) way.

  • Latency is a product of sample rate, AND processing power. Upping the sample rate will lower the latency, but it dramatically increases the load on the processor: at 96kHz it has to do more than twice as much work for every millisecond of audio. If it can't finish the work in time to spit the sample out, it crackles. For any given processor, you get lower latency at 44.1kHz, because the processor does less work, so you can push the latency lower than you could at 96k before you get crackles.

    Higher sample rates are only for when a) it makes a noticeable difference to you, b) the processor can keep up, and c) the extra latency you must add to avoid crackles is acceptable. It's kind of a big-engine, "processor power to burn" setting.

    That being said, synth waveform generation and effects that add harmonics are areas where higher sample rates can improve audio quality by reducing Nyquist aliasing. But depending on the programmer's skill at DSP, they may have created an audio engine that generates nice-sounding harmonics with little aliasing at lower sample rates, in which case upping the sample rate does nothing to improve aliasing, because the quality was already good.
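
    To put rough numbers on that trade-off, here is a throwaway sketch of the deadline arithmetic (not a real render callback, just frames over sample rate):

    ```swift
    // Each render callback must fill `frames` samples within frames / sampleRate
    // seconds. Doubling the sample rate at a fixed buffer size halves that
    // wall-clock deadline while the DSP still produces the same number of samples
    // per buffer; miss the deadline and the output crackles.
    func renderDeadlineMs(frames: Double, sampleRate: Double) -> Double {
        frames / sampleRate * 1000
    }

    print(renderDeadlineMs(frames: 256, sampleRate: 44_100))  // ≈ 5.8 ms to do the work
    print(renderDeadlineMs(frames: 256, sampleRate: 96_000))  // ≈ 2.7 ms for the same buffer
    ```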

  • @Processaurus said:
    Latency is a product of sample rate, AND processing power. Upping the sample rate will lower the latency, but it dramatically increases the load on the processor: at 96kHz it has to do more than twice as much work for every millisecond of audio. If it can't finish the work in time to spit the sample out, it crackles. For any given processor, you get lower latency at 44.1kHz, because the processor does less work, so you can push the latency lower than you could at 96k before you get crackles.

    Higher sample rates are only for when a) it makes a noticeable difference to you, b) the processor can keep up, and c) the extra latency you must add to avoid crackles is acceptable. It's kind of a big-engine, "processor power to burn" setting.

    That being said, synth waveform generation and effects that add harmonics are areas where higher sample rates can improve audio quality by reducing Nyquist aliasing. But depending on the programmer's skill at DSP, they may have created an audio engine that generates nice-sounding harmonics with little aliasing at lower sample rates, in which case upping the sample rate does nothing to improve aliasing, because the quality was already good.

    Knowledge! Thanks for enlightening. :)

  • Auria defaults to 24bit, I thought

  • @oat_phipps said:

    Things have gotten better, but overall 44.1/16 bit is the safest (and intended?) way.

    Works for me most of the time :)

    There are issues with apps/AUv3s that don't 'check' the host's sample rate and adapt to it, causing chaos when freezing audio or, in some cases, making things sound out of tune (i.e. feeding 48kHz audio into a 44.1kHz host without sample-rate conversion, etc.).

    Latest PITA for me is the awesome new Model D from Moog, which simply refuses to do a correct audio freeze at 16-bit@44.1kHz without mayhem. If I set the session sample rate to 24-bit@96kHz, freezing somehow works but things are not in sync. Ironically enough it works perfectly in BM3 at whatever sample rate/bit depth I choose, so it is a Cubasis issue...
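
    As a side note, the 'out of tune' failure mode is easy to quantify: playing 48kHz material in a 44.1kHz host without conversion shifts everything by the ratio of the two rates, roughly a semitone and a half flat. A back-of-the-envelope sketch, not tied to any particular app:

    ```swift
    import Foundation

    // Playing audio recorded at `sourceRate` back at `hostRate` without sample-rate
    // conversion changes both speed and pitch by the factor hostRate / sourceRate.
    func detuneSemitones(sourceRate: Double, hostRate: Double) -> Double {
        12 * log2(hostRate / sourceRate)
    }

    print(detuneSemitones(sourceRate: 48_000, hostRate: 44_100)) // ≈ -1.47 (flat and slow)
    print(detuneSemitones(sourceRate: 44_100, hostRate: 48_000)) // ≈ +1.47 (sharp and fast)
    ```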

  • @Samu said:

    @oat_phipps said:

    Things have gotten better, but overall 44.1/16 bit is the safest (and intended?) way.

    Works for me most of the time :)

    There are issues with apps/AUv3s that don't 'check' the host's sample rate and adapt to it, causing chaos when freezing audio or, in some cases, making things sound out of tune (i.e. feeding 48kHz audio into a 44.1kHz host without sample-rate conversion, etc.).

    Latest PITA for me is the awesome new Model D from Moog, which simply refuses to do a correct audio freeze at 16-bit@44.1kHz without mayhem. If I set the session sample rate to 24-bit@96kHz, freezing somehow works but things are not in sync. Ironically enough it works perfectly in BM3 at whatever sample rate/bit depth I choose, so it is a Cubasis issue...

    Before you attempt to freeze your Model D track in Cubasis, temporarily switch it from AU to IAA mode by reloading the instrument and applying the same preset. You can freeze in 44.1kHz 16 bit if you use the IAA version, so it is most likely more AU confusion. I tried the new options, as well as piping in through AUM, but ended up with silence in the frozen AU track. IAA worked well, though.

  • I didn’t check to see if AU parameter automation survives the switching of instruments, so proceed with caution. :#

  • @CracklePot said:
    I didn’t check to see if AU parameter automation survives the switching of instruments, so proceed with caution. :#

    It doesn't :(

    I hope Steinberg and Moog sort this out asap!

    Some old bug must have crept back into Cubasis 2.4, as I'm having the same issues with Model 15, which used to work perfectly with the previous version of Cubasis...

  • I had similar questions a while back. I did some tests and personally found the quality at 96k noticeably better than 44.1k, but not much of a jump from 48k to 96k. So I mostly mess around at 48k.

  • @NoiseHorse said:
    Auria defaults to 24bit, I thought

    Auria projects are all 24bit by default and I've never run into any issues. GarageBand also supports 24bit without problems. But obviously that's completely different to the sample rate issue.

  • 96k is better, but noticeably twice as good? In certain circumstances maybe, but it's twice the processing power needed and twice the file size of 48k.
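
    The file-size half of that is easy to put in numbers, since uncompressed PCM grows linearly with sample rate, bit depth and channel count. A rough sketch (stereo PCM, ignoring container overhead):

    ```swift
    // Uncompressed PCM size: sampleRate * (bitDepth / 8) * channels * seconds.
    func megabytesPerMinute(sampleRate: Double, bitDepth: Double, channels: Double = 2) -> Double {
        sampleRate * (bitDepth / 8) * channels * 60 / 1_048_576
    }

    print(megabytesPerMinute(sampleRate: 44_100, bitDepth: 16)) // ≈ 10.1 MB per minute
    print(megabytesPerMinute(sampleRate: 48_000, bitDepth: 24)) // ≈ 16.5 MB per minute
    print(megabytesPerMinute(sampleRate: 96_000, bitDepth: 24)) // ≈ 33.0 MB per minute
    ```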

  • @Samu said:

    @CracklePot said:
    I didn’t check to see if AU parameter automation survives the switching of instruments, so proceed with caution. :#

    It doesn't :(

    I hope Steinberg and Moog sort this out asap!

    Some old bug must have crept back into Cubasis 2.4, as I'm having the same issues with Model 15, which used to work perfectly with the previous version of Cubasis...

    Shazzbat. I kind of think Moog compensated for the pre-update Cubasis in their AU approach, while Steinberg adjusted the way Cubasis handles AU. Now we have a chasing the tail type of situation. :D

  • @CracklePot said:

    Shazzbat. I kind of think Moog compensated for the pre-update Cubasis in their AU approach, while Steinberg adjusted the way Cubasis handles AU. Now we have a chasing the tail type of situation. :D

    Could be, but I'm too tired (and feeling like WTF!) to write a proper bug report :D

    Things like track-freezing at all supported project sample rates should be properly verified before an update is pushed out to the App Store. Oh well, maybe we'll get Cubasis 2.4.1 soonish together with updated Moog apps, and when that happens we might be looking at iOS11.4 which is supposed to fix the AUv3 issues that are Apple's fault.

    Today I've been busy reporting bugs to Retronyms in their iMPC Pro 2 app. If the 'program list' contains more than 10 items, the last item cannot be selected, as the UI has a 'spring function' that hides it...

  • @Samu Well, you are awesome! :)
    Thank you for doing all you do to help keep the iOS music experience from being a total train wreck. B)

  • 44.1kHz is higher than 44kHz last time I looked.

    By the way, anyone working with video at any professional level is going to ignore 44.1kHz and use 48kHz, which almost all proper video playout will assume.

  • Mostly 44.1kHz 24bit.
    But I might move more to 48kHz 24bit, since I might get more into music for media and stuff.

  • 24/44.1 here if it's for music. 24/48 for video.

  • @Tarekith said:
    24/44.1 here if it's for music. 24/48 for video.

    This. For life.

  • By the way, don’t get too hung up on 24bit. The truth is, 16 bit is more than adequate for both input and final output. With 16 bit, you can ignore roughly the bottom 2 bits, because they’re below the noise floor. With 24 bits, you can ignore about 10 bits, because they’re all below the noise floor.

    Where 24 bits comes in handy is in internal and intermediate processing. If you’re doing something that requires a multiply operation, for example, you want maximum internal bit depth to do the multiply with, to prevent quantisation effects showing up. Once all that is done, you can settle on 16 bit final output and everyone will be happy enough. Similarly, if your inputs to the process were 16 bit, you’re fine. It’s just at the internal processing stage that you’d want more, temporarily.
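
    To put rough numbers on the noise-floor point: the usual rule of thumb is about 6dB of dynamic range per bit, and re-quantising to 16 bit between every stage is where small errors would pile up. A quick sketch using the standard textbook approximation (illustrative values, not measurements from any app):

    ```swift
    // Rule of thumb: dynamic range of n-bit PCM ≈ 6.02 * n + 1.76 dB.
    func dynamicRangeDb(bits: Double) -> Double {
        6.02 * bits + 1.76
    }
    print(dynamicRangeDb(bits: 16))   // ≈ 98 dB
    print(dynamicRangeDb(bits: 24))   // ≈ 146 dB

    // Re-quantising an intermediate result to 16 bit rounds it to 1/32767 steps.
    // That is harmless on a final mix, but doing it between every processing stage
    // lets the rounding errors accumulate, which is why hosts keep more precision
    // internally while processing.
    func quantisedTo16Bit(_ sample: Double) -> Double {
        (sample * 32767).rounded() / 32767
    }
    print(quantisedTo16Bit(0.123456789))  // ≈ 0.123447, a tiny error per stage
    ```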

  • @Samu said:

    we might be looking at iOS11.4 which is supposed to fix the AUv3 issues that are Apple's fault.

    where did you hear about this?

  • @realdavidai said:

    we might be looking at iOS11.4 which is supposed to fix the AUv3 issues that are Apple's fault.

    where did you hear about this?

    It's been mentioned over at the Steinberg forums that AUv3 on iOS has an issue that Apple needs to address regarding off-line rendering (which is used for freezing) of CPU-heavy plug-ins.

    Currently the only plug-ins that fail to render for me with Cubasis 2.4 are Model 15 & Model D. Model 15 used to work with the previous version of Cubasis after Moog made a fix. Apparently that 'fix' is no longer present in Cubasis 2.4. I have no issues with audio recording of Model 15 or Model D with BeatMaker 3...

  • 48k 24 bit

    • never 44.1k
  • @Mayo said:
    48k 24 bit

    • never 44.1k

    Only as long as the device natively supports it without up/re-sampling :)

  • @u0421793 said:
    By the way, don’t get too hung up on 24bit. The truth is, 16 bit is more than adequate for both input and final output. With 16 bit, you can ignore roughly the bottom 2 bits, because they’re below the noise floor. With 24 bits, you can ignore about 10 bits, because they’re all below the noise floor.

    Where 24 bits comes in handy is in internal and intermediate processing. If you’re doing something that requires a multiply operation, for example, you want maximum internal bit depth to do the multiply with, to prevent quantisation effects showing up. Once all that is done, you can settle on 16 bit final output and everyone will be happy enough. Similarly, if your inputs to the process were 16 bit, you’re fine. It’s just at the internal processing stage that you’d want more, temporarily.

    This.

  • @Samu said:

    @Mayo said:
    48k 24 bit

    • never 44.1k

    Only as long as the device natively supports it without up/re-sampling :)

    I'm still a DAW guy; iOS is just another musical instrument for me, to be tracked into my DAW.
    So 48k 24bit is my minimum, and some albums I still do at 96kHz.

  • @Processaurus said:
    Latency is a product of sample rate, AND processing power

    Isn’t this a little misleading? Processing power is a factor for sure (for tolerable audio), but the latency at 44.1kHz/16bit/256 frames on a fast device is the same as on a slow device at 44.1kHz/16bit/256 frames, is it not? Correct me if I’m wrong.

    I might start another forum post for this, but does anyone have any idea whether RAM greatly affects a stable low-latency setup? Or is this mainly a CPU issue?

  • I’m not sure what Gadget uses, but I certainly use 96kHz in Auria when mixing stems and mastering the final product. Now, while you can’t actually hear the difference between 44kHz and 96kHz, various effects such as EQing, MB compression, etc. come off more natural/musical. It’s the difference between using student-grade Artist’s Loft oil paints on a cheap canvas and Old Holland Classics on a hand-crafted canvas.

  • OK, so there is something I definitely don't understand about my basic setup. Now I'm getting LOWER CPU usage with a HIGHER sample rate. This is just a test setup, not a full session: my Zoom U-44 audio interface, an iPad 6, AUM (set up so that the audio input goes straight to the output), and a CPU monitor app. Could this be due to a preference of the audio interface? It states that it runs at many different sample rates... maybe there's a default sample rate and the other sample rates are converted from that? Any ideas here?
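
    One cheap way to test the 'default rate' theory would be to ask the audio session what the hardware actually reports after requesting different rates; if the reported rate never budges, the conversion is happening in the driver or the interface rather than in the apps. A hypothetical little diagnostic, assuming a plain AVAudioSession setup:

    ```swift
    import AVFoundation

    // Request a few rates in turn and print what the current route actually reports.
    // If session.sampleRate never budges, the interface / Core Audio is resampling
    // to a fixed rate behind the scenes, which would also explain odd CPU readings.
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playAndRecord, mode: .default, options: [])
        for requested in [44_100.0, 48_000.0, 96_000.0] {
            try session.setActive(false)
            try session.setPreferredSampleRate(requested)
            try session.setActive(true)
            print("requested \(requested) Hz -> hardware reports \(session.sampleRate) Hz")
        }
    } catch {
        print("audio session error:", error)
    }
    ```

    If the reported rate stays the same for all three requests, the conversion is happening outside the apps, which would fit the 'preferred default rate' idea.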
