
AUM as a mixer

Probably a dumb rookie question, but...

If I had a real audio interface (instead of this counterfeit guitar iRig cable), could I use AUM as a mixer for real instruments? I'd want a vocal mic and at least two physical instruments, and to mix them live while performing.

Comments

  • edited December 2017

    @wellingtonCres said:
    yes

    Thanks -- now you've got me excited. What would I need in the way of an interface to make this happen?

    Ideally I would like for one of my physical instruments to go into AUM, get mixed, and then sent back out into the input of another hardware synth.

  • You need an audio interface with as many inputs and outputs as you want to use at the same time, and then you're off to the races.

  • I use a Focusrite Scarlett 6i6 for shows and mobile stuff. I was pretty surprised when AUM recognized it. Just popped up like it would in a legit DAW. It really is an amazing workhorse.

    The 6i6 has two mic/inst ins on the front, two line ins on the back, and four outs. It even has two headphone monitors and S/PDIF for expansion. You could run your instrument into AUM, out through one output into a synth, and back through a line in into another channel in AUM. You could even create a bus out in AUM that goes to one of the line outs to your synth to mix it with the instrument like the pic below, or have it on another channel with the same input going to a different output. So many possibilities!

  • For me, the main issue, while this does work, is latency. Some apps don’t work well on lower buffer settings.

  • @Fitz said:
    For me, the main issue, while this does work, is latency. Some apps don’t work well on lower buffer settings.

    Yeah, I was going to mention this. Since you're effectively monitoring all the channels, you're dealing with round-trip latency. Some interfaces come with control software that does internal mixing, where the audio stays local to the interface for "zero-latency monitoring", but that happens outside of AUM.
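For a rough sense of the numbers: round-trip latency can be estimated from the buffer size and sample rate, since the signal is buffered once on the way in and once on the way out. A minimal sketch, assuming a 44.1 kHz sample rate and ignoring the extra delay from the interface's converters and drivers (which adds a few more milliseconds in practice):

```python
def round_trip_latency_ms(buffer_size, sample_rate=44100):
    """Estimate round-trip latency: one buffer of delay on input,
    one on output. Converter/driver overhead not included."""
    one_way_ms = buffer_size / sample_rate * 1000
    return 2 * one_way_ms

print(round_trip_latency_ms(256))  # ~11.6 ms
print(round_trip_latency_ms(64))   # ~2.9 ms
```

So at a 256-sample buffer you're already above 11 ms before the hardware adds its own overhead, which is why the buffer setting matters so much for live monitoring.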

  • Thanks guys, I am very excited now. With the droning ambient stuff I think latency isn't really as much of a problem.

  • @Wrlds2ndBstGeoshredr said:

    @wellingtonCres said:
    yes

    Thanks -- now you've got me excited. What would I need in the way of an interface to make this happen?

    Ideally I would like for one of my physical instruments to go into AUM, get mixed, and then sent back out into the input of another hardware synth.

    Can easily be done; just watch out for latency.

  • @Fitz said:
    For me, the main issue, while this does work, is latency. Some apps don’t work well on lower buffer settings.

    A lower buffer size means the audio stream is cut into shorter chunks before it reaches the CPU. The CPU then has to process more of these shorter chunks per second, which puts more strain on it but also cuts down latency. If an app needs a lot of CPU power, it simply won't work at lower buffer sizes without the CPU maxing out and adding crackles and glitches to the audio.
    You need to find the sweet spot for the buffer size based on how CPU-heavy your workload is; if you want to use more CPU-heavy apps at lower buffers (= lower latency), you need a more powerful CPU. Starting with a low buffer and raising it as the project grows is a good idea if you want to keep latency as small as possible. It's also best to leave CPU-heavy effects until your project reaches the point where you no longer need to play in notes in real time, i.e. when latency doesn't matter.
    Developers, in turn, need to find the sweet spot between CPU efficiency and sound quality. Some devs are better at writing efficient code than others: apps with unnecessarily complex (CPU-heavy) code are poorly made, and you might want to look for better-coded alternatives if some simple app takes too much CPU. Other apps, on the other hand, are so complex and offer such high-quality sound (like Model 15, for example) that even well-written code will still tax your CPU heavily.
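The trade-off described above is easy to put in numbers: a smaller buffer covers less time (lower latency) but forces the CPU to deliver audio more often (more overhead per second). A quick sketch, assuming a 44.1 kHz sample rate (a common iOS default):

```python
SAMPLE_RATE = 44100  # Hz, assumed sample rate

for buffer_size in (64, 128, 256, 512, 1024):
    latency_ms = buffer_size / SAMPLE_RATE * 1000   # time covered by one buffer
    callbacks_per_sec = SAMPLE_RATE / buffer_size   # how often the CPU must refill it
    print(f"{buffer_size:5d} samples -> {latency_ms:6.2f} ms/buffer, "
          f"{callbacks_per_sec:6.1f} buffers/s")
```

At 64 samples the CPU gets interrupted almost 700 times a second with barely 1.5 ms to finish each chunk; at 1024 samples it has over 23 ms per chunk but you feel the delay when playing live. The "sweet spot" is wherever your apps can reliably finish their work inside each chunk's deadline.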
