Sloppy timing in Cubasis!

I’ve noticed that the timing in Cubasis is not tight. It seems that the playhead holds back, so there is no consistent timing, like my CPU is holding back. This is VERY annoying, and hard to check by ear… It almost drives me crazy.

So I decided to record the audio coming from the headphone jack of my iPad into my laptop, to see if I could SEE the timing problem in the waveform.
After that I did an audio mixdown in Cubasis of the same audio to compare…

To check the waveform I used the wave editor in Caustic: I set the song tempo to the same tempo as the project file so I could see the waveform aligned to a beat grid.

Well:

  • the audio recorded from the iPad shows timing problems
  • the audio from the mixdown shows solid timing

The audio I used to test is just a simple hi-hat pattern that uses only 2% of the CPU… But imagine when I use AU synths and compressors, and the CPU meter in Cubasis is around 65%. The timing will get even worse…

What can I do? Nothing is as irritating as sloppy timing!

Here's a video to show you what I mean:

  • It shows the audio recording first: sloppy
  • and from 1:27 it shows the waveform from the mixdown: solid


Comments

  • edited August 2018

    I’ve noticed that on several iOS apps too. Drives me nuts too, as timing is crucial. I’ve been mostly on my laptop with Ableton for a while now; these issues and some limitations are the main culprits.

  • edited August 2018

    Most iOS apps implement MIDI transport incorrectly by sending MIDI events in realtime, thus the timing stability depends on CPU load.

    There's only a handful of apps that do it correctly (sending the MIDI data in advance with nanosecond timestamps); one example is Xequence.

    An additional problem you'll face is that many MIDI RECEIVER apps (synths etc.) ALSO implement this wrongly: they only receive the events in realtime, instead of receiving them in advance and then inserting the audio with sample accuracy using the timestamps, independent of system load (one example that does this correctly is Korg Gadget).
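
    To illustrate (a minimal Swift sketch against the classic CoreMIDI C API, not Xequence's actual code; the destination lookup and note data are placeholders): "sending in advance" just means stamping each packet with the host time at which it is due and handing it to CoreMIDI early.

    ```swift
    import CoreMIDI
    import Darwin

    // MIDITimeStamp is in host ticks (mach_absolute_time units); convert ms to ticks.
    var timebase = mach_timebase_info_data_t()
    mach_timebase_info(&timebase)
    func ticks(ms: Double) -> UInt64 {
        UInt64(ms * 1_000_000 * Double(timebase.denom) / Double(timebase.numer))
    }

    var client = MIDIClientRef()
    MIDIClientCreate("Sequencer" as CFString, nil, nil, &client)
    var outPort = MIDIPortRef()
    MIDIOutputPortCreate(client, "Out" as CFString, &outPort)
    let dest = MIDIGetDestination(0) // first destination, for illustration only

    // Schedule a note-on ~300 ms ahead of "now". The MIDI server delivers it
    // at exactly that host time, no matter what the app's CPU load does meanwhile.
    let noteOn: [UInt8] = [0x90, 42, 100]
    var list = MIDIPacketList()
    let first = MIDIPacketListInit(&list)
    MIDIPacketListAdd(&list, MemoryLayout<MIDIPacketList>.size, first,
                      mach_absolute_time() + ticks(ms: 300),
                      noteOn.count, noteOn)
    MIDISend(outPort, dest, &list)
    ```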

    A very unfortunate situation indeed, especially for EDM, where timing is absolutely crucial.

    The only thing we can do is be vocal and repeatedly bring this to the attention of developers.

    (Yes, I know I'm annoying, but I simply want this fixed for good, for the benefit of all iOS music makers, because as it stands, producing EDM on iOS is essentially impossible unless you use exactly the right apps or stay inside a walled garden.)

  • I don’t mind loose timing, what I hate is bad intonation... although sometimes that’s ok too.

  • edited August 2018

    @SevenSystems said:
    Most iOS apps implement MIDI transport incorrectly by sending MIDI events in realtime, thus the timing stability depends on CPU load.

    There's only a handful of apps that do it correctly (sending the MIDI data in advance with nanosecond timestamps); one example is Xequence.

    An additional problem you'll face is that many MIDI RECEIVER apps (synths etc.) ALSO implement this wrongly: they only receive the events in realtime, instead of receiving them in advance and then inserting the audio with sample accuracy using the timestamps, independent of system load (one example that does this correctly is Korg Gadget).

    A very unfortunate situation indeed, especially for EDM, where timing is absolutely crucial.

    The only thing we can do is be vocal and repeatedly bring this to the attention of developers.

    (Yes, I know I'm annoying, but I simply want this fixed for good, for the benefit of all iOS music makers, because as it stands, producing EDM on iOS is essentially impossible unless you use exactly the right apps or stay inside a walled garden.)

    Again, thank you for the rock-solid MIDI timing in Xequence!!! Accurate MIDI timing is absolutely essential, and there are many pieces of kit which are hamstrung because of poor implementation of MIDI transport.

  • @TheOriginalPaulB said:
    I don’t mind loose timing, what I hate is bad intonation... although sometimes that’s ok too.

    Fortunately, most apps somehow manage to transfer the pitch information correctly :D

  • @SevenSystems said:
    The only thing we can do is be vocal and repeatedly bring this to the attention of developers.

    Maybe also ask Apple to improve documentation? I'm sure developers are not lazy, but Apple really does not make life easy with their minimal documentation on how to do this properly. At least that was the case some time ago when I tried to look into developing a sequencer kind of app. The MIDI event can be given timing = 0, which means handle NOW, but the proper way is to put a precise time stamp on it and send out (or process) the event slightly beforehand. Apple didn't give much documentation on how to properly set these things up, unless you used their own MidiSequencerTrack, but that one was full of bugs and not really usable.

    Anyway, just saying that the situation is unfortunate, but it's not only because of lazy developers. These things are tricky, and it's just much easier to set that timing to 0 and hope real-time processing will be "good enough".

    Btw, Xequence turns into an attractive buy after seeing your reply. Another app I just take for granted handles this well is AUM (because, you know). I guess AUM can't fix bad timing from other apps, but if an app sends control data to AUM's internal control, then I'd be surprised if it didn't time this properly.
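
    For the record, this is roughly what that difference looks like in code (a hedged sketch against the classic CoreMIDI C API; the helper name and the 300 ms lead are illustrative, not from any particular app):

    ```swift
    import CoreMIDI
    import Darwin

    // Build a one-packet list carrying `bytes`, due at host time `time`.
    func packetList(_ bytes: [UInt8], at time: MIDITimeStamp) -> MIDIPacketList {
        var list = MIDIPacketList()
        let first = MIDIPacketListInit(&list)
        MIDIPacketListAdd(&list, MemoryLayout<MIDIPacketList>.size, first,
                          time, bytes.count, bytes)
        return list
    }

    let noteOn: [UInt8] = [0x90, 60, 100]

    // timing = 0 means "handle NOW": delivery jitters with whatever the CPU is doing.
    let immediate = packetList(noteOn, at: 0)

    // The proper way: a precise host-time stamp, handed to CoreMIDI slightly early.
    var timebase = mach_timebase_info_data_t()
    mach_timebase_info(&timebase)
    let lead = UInt64(300_000_000.0 * Double(timebase.denom) / Double(timebase.numer)) // ~300 ms in host ticks
    let scheduled = packetList(noteOn, at: mach_absolute_time() + lead)
    ```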

  • edited August 2018

    @bleep said:

    @SevenSystems said:
    The only thing we can do is be vocal and repeatedly bring this to the attention of developers.

    Maybe also ask Apple to improve documentation? I'm sure developers are not lazy, but Apple really does not make life easy with their minimal documentation on how to do this properly. At least that was the case some time ago when I tried to look into developing a sequencer kind of app. The MIDI event can be given timing = 0, which means handle NOW, but the proper way is to put a precise time stamp on it and send out (or process) the event slightly beforehand. Apple didn't give much documentation on how to properly set these things up, unless you used their own MidiSequencerTrack, but that one was full of bugs and not really usable.

    Anyway, just saying that the situation is unfortunate, but it's not only because of lazy developers. These things are tricky, and it's just much easier to set that timing to 0 and hope real-time processing will be "good enough".

    Btw, Xequence turns into an attractive buy after seeing your reply. Another app I just take for granted handles this well is AUM (because, you know). I guess AUM can't fix bad timing from other apps, but if an app sends control data to AUM's internal control, then I'd be surprised if it didn't time this properly.

    Absolutely in agreement, the CoreMIDI documentation is abysmal, especially regarding the "big picture". I found out most stuff by trial and error, sped up a bit by my "general understanding" of buffered queues and my almost sick desire to get tight timing (a hi-hat off by one millisecond feels to me like someone dropping a hammer on my head!).

    What Xequence does is (simplified): it has a configurable "MIDI Pre-Roll" setting (defaulting to 300 ms). A timer then runs every 300 ms and checks that at least the next 600 ms of MIDI data are queued.

    Of course, when the user stops the transport or makes edits during playback, you have to unqueue all events using MIDIFlushOutput (and for edits, tag those events as unqueued in your queue implementation, then call the "queue checker" immediately so that the new data gets inserted right away). Interestingly, MIDIFlushOutput also seems to send [All] Note Offs, so you don't have to do it yourself (another undocumented thing)...
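
    In case a concrete shape helps, here's a bare-bones sketch of that scheme (the type and method names are simplified stand-ins, not Xequence's actual implementation; the client/port setup is assumed to exist already):

    ```swift
    import Foundation
    import CoreMIDI

    struct QueuedEvent {
        let hostTime: MIDITimeStamp // absolute host time the event is due
        let bytes: [UInt8]
        var queued = false
    }

    final class PreRollScheduler {
        var events: [QueuedEvent]   // the track's events, sorted by hostTime
        let outPort: MIDIPortRef
        let dest: MIDIEndpointRef
        let preRollTicks: UInt64    // ~300 ms expressed in host ticks

        init(events: [QueuedEvent], outPort: MIDIPortRef,
             dest: MIDIEndpointRef, preRollTicks: UInt64) {
            self.events = events
            self.outPort = outPort
            self.dest = dest
            self.preRollTicks = preRollTicks
            // The "queue checker": runs every pre-roll interval (300 ms) and
            // makes sure at least the next 2x pre-roll (600 ms) is queued.
            Timer.scheduledTimer(withTimeInterval: 0.3, repeats: true) { [weak self] _ in
                self?.fillQueue()
            }
        }

        func fillQueue() {
            let horizon = mach_absolute_time() + 2 * preRollTicks
            for i in events.indices where !events[i].queued && events[i].hostTime <= horizon {
                var list = MIDIPacketList()
                let first = MIDIPacketListInit(&list)
                MIDIPacketListAdd(&list, MemoryLayout<MIDIPacketList>.size, first,
                                  events[i].hostTime, events[i].bytes.count, events[i].bytes)
                MIDISend(outPort, dest, &list) // CoreMIDI holds it until hostTime
                events[i].queued = true
            }
        }

        // On stop, or on edits during playback: unschedule everything CoreMIDI
        // is still holding, mark our copies as unqueued, and refill immediately.
        func flushAndRequeue() {
            MIDIFlushOutput(dest)
            for i in events.indices { events[i].queued = false }
            fillQueue()
        }
    }
    ```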

  • edited August 2018

    Hi thanx for your input...

    I did some more testing… When I sequence the same hi-hat pattern in Caustic and check the waveform, it is rock solid.

    So, yeah, I basically bought Cubasis for nothing... I just can't work with a sequencer that can't get the timing right. That's really an essential feature for a sequencer. And in my example I didn't even use an AU synth or effect, just Cubasis' own Mini Sampler. I get tired of chasing bugs; I just want something that works...

    So, will Xequence + AUM be the right combo for solid timing?

  • edited August 2018

    @SlowwFloww Xequence + AUM is a good combination in general. The timing depends on whether the receiving apps have the "correct" implementation. Many do, some don't. It's best to try.

    The one "multi-timbral all-in-one production combination" that I have extensively tested to be super-tight and hassle-free is Xequence + Korg Gadget. (No I'm not a Korg employee :D)

  • tja
    edited August 2018

    @SevenSystems What you wrote sounds very interesting.
    And as I wrote elsewhere, I think that developers need to increase pressure on Apple.
    If the Apple developer forums were full of complaints and requests, Apple would finally need to fix and document things.

    Just accepting things as they are and thinking that nothing will change will indeed lead to no changes.

    Get a voice and use it; only this way can things get better.

    My 2 cents...

    And about badly programmed apps: how can you check this?
    Can you give an example of how to detect such bad MIDI timing?

  • @SlowwFloww said:
    Hi thanx for your input...

    I did some more testing… When I sequence the same hi-hat pattern in Caustic and check the waveform, it is rock solid.

    So, yeah, I basically bought Cubasis for nothing... I just can't work with a sequencer that can't get the timing right. That's really an essential feature for a sequencer. And in my example I didn't even use an AU synth or effect, just Cubasis' own Mini Sampler. I get tired of chasing bugs; I just want something that works...

    So, will Xequence + AUM be the right combo for solid timing?

    Trying to get your attention, @LFS
    Also, there was this point about the reproducible longer WAV files from Cubasis.
    You wanted to get this checked / fixed....

  • edited August 2018

    @tja said:
    And about badly programmed apps: how can you check this?
    Can you give an example of how to detect such bad MIDI timing?

    Put a 16th-note hi-hat line in Xequence, target the app, and record the audio. If each hi-hat is not exactly on time and equidistant from the previous one, the app (or one of the intermediate apps) is broken :)

    Depending on the person, the timing is actually bad enough that it's not necessary to "measure" anything, though; you simply hear it.
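
    If you'd rather see numbers than squint at a waveform: once you have the onset time of each hit (read off a wave editor's grid, as described earlier in the thread), a few lines are enough to measure the deviation. A Swift sketch; the onset values below are made up:

    ```swift
    // Deviation of each hit from the nearest 16th-note grid line, in milliseconds.
    func jitterMilliseconds(onsets: [Double], bpm: Double) -> [Double] {
        let step = 60.0 / bpm / 4.0 // length of a 16th note in seconds
        return onsets.map { t in
            let nearest = (t / step).rounded() * step
            return (t - nearest) * 1000.0
        }
    }

    // Example at 120 BPM (16th = 125 ms): the third hit lands 9 ms late.
    let deviations = jitterMilliseconds(onsets: [0.000, 0.125, 0.259, 0.375], bpm: 120)
    print(deviations) // [0.0, 0.0, 9.0, 0.0] (up to floating-point noise)
    ```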

  • @tja said:
    @SevenSystems What you wrote sounds very interesting.
    And as I wrote elsewhere, I think that developers need to increase pressure on Apple.
    If the Apple developer forums were full of complaints and requests, Apple would finally need to fix and document things.

    Just accepting things as they are and thinking that nothing will change will indeed lead to no changes.

    Get a voice and use it; only this way can things get better.

    My 2 cents...

    And about badly programmed apps: how can you check this?
    Can you give an example of how to detect such bad MIDI timing?

    Yes, I agree developers should stand together and ask Apple for better documentation. It's ridiculous after all these years, and it's not as if Apple lacks the resources to improve things.

  • @SevenSystems said:

    @tja said:
    And about badly programmed apps: how can you check this?
    Can you give an example of how to detect such bad MIDI timing?

    Put a 16th-note hi-hat line in Xequence, target the app, and record the audio. If each hi-hat is not exactly on time and equidistant from the previous one, the app (or one of the intermediate apps) is broken :)

    Depending on the person, the timing is actually bad enough that it's not necessary to "measure" anything, though; you simply hear it.

    I tried.
    With bs-16i

    And as I need a host, I used AUM for this and recorded to AudioShare.

    But in AudioShare I cannot easily see if the timing is right.
    From hearing, it seems OK.
    But then, I did not create any CPU load.

    Do you have an example of a synth that is problematic?
    Via PM, if you prefer.

  • edited August 2018

    Sloppy vs. Perfect (clearly audible for me without any kind of waveform editor).

    It gets even worse when the synth is in the foreground, probably because it will then be updating its UI, which uses additional CPU.

    (please ignore the "apeMatrix thru AB" bit in the instrument name, that's from a previous test... this is directly through CoreMIDI)

    I cannot read the synth names; please send me a bad one....

    I need a visual check, I fear.

  • @SevenSystems said:
    Sloppy vs. Perfect (clearly audible for me without any kind of waveform editor).

    It gets even worse when the synth is in the foreground, probably because it will then be updating its UI, which uses additional CPU.

    (please ignore the "apeMatrix thru AB" bit in the instrument name, that's from a previous test... this is directly through CoreMIDI)

    Sloppy MIDI timing, the scourge of electronic music!

  • edited August 2018

    At the top you see a close-up in Ableton of my audio recording from my iPad's u22 audio interface. I made a 16th-note hi-hat pattern... As you can see, the timing is off... The only thing playing is one track with a hi-hat sample in the Mini Sampler....

    At the bottom you see a waveform from a mixdown... solid timing.

    The settings I use in Cubasis are:
    polyphony: 24
    hardware latency: low
    large recording buffer: on
    iPad flight mode: on

    @LFS What can I do to get solid timing in Cubasis?

    I don't know, maybe we've hit the 'timing roof' with Cubasis and its somewhat dated 48 PPQN resolution?!

  • @Samu nah it's not about resolution... but yeah, 48 PPQ isn't too much either when you want, say, smooth controller ramps :)
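
    To put numbers on the resolution point (simple arithmetic; the tempo is chosen arbitrarily):

    ```swift
    // At 48 PPQN the sequencer grid is 1/48th of a quarter note, so the tick
    // length depends on tempo: tick = 60 s / BPM / 48.
    let bpm = 120.0
    let tickMs = 60.0 / bpm / 48.0 * 1000.0
    print(tickMs) // ≈ 10.4 ms per tick: coarse for smooth controller ramps,
                  // but not the cause of random jitter if events land on time
    ```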

  • @SevenSystems said:
    @Samu nah it's not about resolution... but yeah, 48 PPQ isn't too much either when you want, say, smooth controller ramps :)

    Could it then be caused by voice stealing? (The Polyphony setting in the Cubasis Audio tab?)

    Or simply the fact that internally, Cubasis doesn't send proper timing information to the internal instruments, so the 'output' from them arrives in the next available buffer?!

  • edited August 2018

    I think it's just a CPU problem... like @SevenSystems said: 'Most iOS apps implement MIDI transport incorrectly by sending MIDI events in realtime, thus the timing stability depends on CPU load.'

  • @SlowwFloww said:
    I think it's just a CPU problem... like @SevenSystems said: 'Most iOS apps implement MIDI transport incorrectly by sending MIDI events in realtime, thus the timing stability depends on CPU load.'

    Brings me back to the days of the Amiga and Atari, and why the more expensive computers (Macs at the time) got more solid MIDI timing due to higher CPU speeds :D

  • edited August 2018

    @Samu @SlowwFloww It's not a CPU problem, it's an implementation problem. Cubasis, like many other apps, doesn't send / process timing information along with the events.

    You can have a 72836282873 GHz 512-Core System and you still might encounter situations where realtime transfer of events is not accurate. That's why CoreMIDI, CoreAudio et al provide timestamps so that each app can tell the other apps exactly when an event is supposed to occur, AHEAD OF TIME. But I've probably annoyed everyone enough with this now ;)

    Another way to put it: Most apps use CoreMIDI as if it were a real-time system. But it's really more of an "offline" system, much like MIDI files.
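
    For completeness, the receiving side mentioned earlier works the same way in reverse. A sketch (with illustrative names) of how a synth can turn an event's host-time stamp into a sample offset inside the buffer it is currently rendering, so the note starts sample-accurately regardless of when the MIDI callback fired:

    ```swift
    import CoreMIDI
    import Darwin

    var timebase = mach_timebase_info_data_t()
    mach_timebase_info(&timebase)

    // Host ticks to seconds.
    func seconds(fromHostTicks t: UInt64) -> Double {
        Double(t) * Double(timebase.numer) / Double(timebase.denom) / 1_000_000_000.0
    }

    // Frame offset of a MIDI event inside the buffer being rendered. The buffer
    // starts at `bufferStartHostTime` (the AudioTimeStamp.mHostTime handed to
    // the render callback) and holds `frameCount` frames at `sampleRate`.
    func frameOffset(eventTime: MIDITimeStamp, bufferStartHostTime: UInt64,
                     sampleRate: Double, frameCount: Int) -> Int {
        guard eventTime > bufferStartHostTime else { return 0 } // already due: play at buffer start
        let dt = seconds(fromHostTicks: eventTime - bufferStartHostTime)
        return min(Int(dt * sampleRate), frameCount - 1)
    }
    ```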

  • I brought this up a long time ago and it's the reason I deleted it. I could hear it a mile off.

  • @SevenSystems said:

    You can have a 72836282873 GHz 512-Core System and you still might encounter situations where realtime transfer of events is not accurate. That's why CoreMIDI, CoreAudio et al provide timestamps so that each app can tell the other apps exactly when an event is supposed to occur, AHEAD OF TIME. But I've probably annoyed everyone enough with this now ;)

    When it's 'real-time', no one knows for sure what's 'ahead of time', and maybe that's why Xequence skips playback of the note that is recorded in a loop and 'catches it' on the next trip around ;) (Meaning, if I do quantised recording, the note is there on the screen but isn't played back on the first pass, only on the second.)

  • edited August 2018

    @Samu: It doesn't have to be totally realtime, but when composing and arranging a song, assuming the next 300 milliseconds (the Xequence default) to be "fixed" is normally good enough. I can live with that (and yes, you observed that correctly: that's why Xequence doesn't "catch" the first note of the loop while recording. That's fixable, I just haven't gotten around to it yet). It's certainly preferable to having your drums all over the place. Most people don't make EDM, so they don't notice. But believe me, if you get the kind of jitter like in the audio waveform @SlowwFloww posted above, you can't really do anything "serious" with such a production, i.e. if you publish it, people will notice. They won't notice it like "oooohhh, the apps that were used to make that song had sloppy timing!!!". They'll say "errr I dunno, I hate this song. Next." :)

    EDIT: OK, the problem goes away in Mixdown, at least for the internal Cubasis synths. But it's also a vibe killer during production.

  • edited August 2018

    @SevenSystems Thanks so much for the clear explanation of this MIDI timing stuff!

    Okay, last test... I placed 4 recordings in Ableton (I tried to align them properly). The iPad was in flight mode, no other app active...

    Track 1: Cubasis recorded from the iPad's audio interface to my laptop
    Track 2: Cubasis Mixdown
    Track 3: Caustic recorded from the iPad's audio interface to my laptop
    Track 4: Caustic Mixdown

  • edited August 2018

    @SevenSystems said:
    EDIT: OK, the problem goes away in Mixdown, at least for the internal Cubasis synths. But it's also a vibe killer during production.

    True! VERY annoying...

  • @SevenSystems said:
    They won't notice it like "oooohhh, the apps that were used to make that song had sloppy timing!!!". They'll say "errr I dunno, I hate this song. Next." :)

    I know :D

    Back in the day there was a 'fight' between Notator/Logic and Cubase over which of them had tighter timing, and Emagic/Notator/Logic etc. always came out on top. I do wonder if that's still the case :D

    @SevenSystems said:
    EDIT: OK, the problem goes away in Mixdown, at least for the internal Cubasis synths. But it's also a vibe killer during production.

    Yepp, and it took a long time until, with 'freeze' and 'mixdown', the timing got somewhat accurate. And now that the 'host' accepts timestamps to do more accurate rendering, it's up to all the older synth apps to get updated... yeah, like that's going to happen in the near future or something. (We still have the 'issues' with synth apps being optimised for certain sample rates, which causes havoc if the host runs at a different sample rate than the apps are optimised for, resulting in crackle-town, out-of-tune playback, etc.)
