Latency optimization questions

I am largely ignorant about minimizing latency on the iPad, or anywhere really. I have long either recorded live into a computer with direct monitoring, or been able to get latency down to a few ms that I can't perceive simply by using a low buffer setting.

I’ll describe my specific setup below, but I am also happy to hear your general tips on latency. I searched threads here but the answers were more about audio vs. audio, and my issue is more MIDI and audio. Beyond setting a low buffer size, I have little direction.

My specific issue is getting this setup all working optimally:

Connected to iPad via usb hub:
*Akai MPK225,
*Moog Sirin via iConnectivity mio (easier that way to hook up to a laptop for deeper editing vs. USB),
*Minilogue XD, and
*Focusrite 2i2.

With a buffer setting of 256 samples, the problem comes when I try to use the Akai arpeggiator on the Sirin. There is a bit of delay; not horrible, but it bugs me.

To do that, I am sending clock out to the Akai, then MIDI notes back into the iPad, through Modstep, and on to the Sirin.

In contrast, sending clock to the Minilogue and playing there, with audio coming from the synth itself of course, seems not to be problematic.

So I guess I should try just connecting the Akai directly to the Sirin, but I want to use the Akai to also play iPad synths.

Is there any reason to think a combined audio and MIDI interface would lower latency versus my audio interface plus a separate USB-to-5-pin mio adapter, e.g. a Focusrite 2i4? Are there round-trip issues, or is it all parallel regardless?

Would a USB 3 interface be faster through a USB 3 hub? E.g.:

https://www.zoom.co.jp/products/audio-converter/uac-2-usb-30-audio-converter


Comments

  • There is very little discussion on latency with iOS. I'm glad you made this topic as I'd like to hear from somebody that knows about how it works.

    From my own experiments you can get audio latency down to 5 ms or lower with a 64 samples / 96 kHz setting, but not many apps support that at all. At 128 samples and 44.1 kHz it's around 12 ms round trip, but it does seem to vary.

    With regards to MIDI latency/jitter/clock all that, again it varies with iOS.

    If I go Rozeta in AUM>MIDI out via audio interface>Minitaur MIDI input I get very little latency/jitter or drift.

    In the scenario where I am sending MIDI clock out to hardware from the iPad via my audio interface or a MIDI interface the results are not good. MIDI clock on iOS varies a lot from app to app.
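The buffer figures above can be sanity-checked with a quick back-of-the-envelope calculation. This is only a sketch: it counts raw buffering (two buffers as a lower bound for a round trip) and ignores converter and driver overhead, which is why measured figures like the 12 ms come out higher than the arithmetic suggests.

```python
# Audio buffering latency: a buffer of N samples at rate R takes N/R
# seconds; a round trip passes through at least an input buffer and an
# output buffer, so 2 buffers is a lower bound (drivers often add more).

def buffer_latency_ms(buffer_size, sample_rate, buffers=2):
    """Approximate buffering-only round-trip latency in milliseconds."""
    return 1000.0 * buffer_size * buffers / sample_rate

print(round(buffer_latency_ms(64, 96000), 2))   # 1.33 -> 64 samples @ 96 kHz
print(round(buffer_latency_ms(128, 44100), 2))  # 5.8  -> 128 samples @ 44.1 kHz
```

The gap between the ~5.8 ms of raw buffering at 128/44.1k and the ~12 ms measured round trip is the converter, driver, and OS overhead on top.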

  • Just to add also: I've achieved sync as tight as an MPC60 (which is slightly better than a Cirklon) with BM3 sending CC messages via my USB audio interface that are converted to gate/clock with my modular.

  • edited May 13

    @BroCoast said:
    There is very little discussion on latency with iOS. I'm glad you made this topic as I'd like to hear from somebody that knows about how it works.

    Interesting, yeah. I guess the reason I know so little about it is that it has all been so simple on desktop over the years. I've just connected things physically and virtually and they've worked. Similar with straight hardware.

  • No replies, but hey, I kept experimenting. I am getting tighter results with adding Step Poly Arp synced via Ableton Link for the Sirin.

  • @Multicellular said:
    No replies, but hey, I kept experimenting. I am getting tighter results with adding Step Poly Arp synced via Ableton Link for the Sirin.

    That should be a lot better. Ableton Link is quite good these days.

    When syncing apps it seems to be a consistent 0.21 ms latency, so it can be nudged to be perfect. I haven't tried using it while MIDI-controlling an external instrument for a while; I'll have to check that out today.

    I'm not surprised by the no replies. :lol:

  • edited May 14

    Ok very interesting results.

    I used Fugue Machine synced to AUM with Ableton Link.
    MIDI out to Minitaur via Steinberg UR22 audio interface.
    File recorded into AudioShare.
    Loaded into Cubasis for inspection.

    Around 0.15 ms consistent latency, no measurable jitter.

    Added Patterning 2 (Link sync'd) into AUM and recorded the same sequence, and the latency increases. So it seems to be load dependent.

    I am so confused now. :lol:

    Disclaimer: This is Bro Science.

  • edited May 14

    Thanks for posting this thread - I was hoping someone knowledgeable would chime in and enlighten us.

    Here are some of my thoughts... and pieces of old knowledge:

    • Old-school physical MIDI itself is slow (31.25 kbit/s), and it is serial transmission, so each note-on, pitch bend, or CC is sent one after the other. So if you try to play a 10-note chord all at once, the notes never come out at exactly the same time. If you were using pitch bend and CC while playing, and running 16 different instruments on 16 different channels down a single MIDI cable, timing used to get quite sketchy.

    • Multi-output MIDI interfaces helped with this by reducing the bottleneck.

    • DAWs are able to record MIDI at high precision.

    • When DAWs play back recorded MIDI, they are able to buffer the notes so they all come out simultaneously, when they should.

    • If the DAW is playing a virtual hosted instrument, then it is possible to send the notes in advance so that any latency in the plugin/instrument can be compensated for. In some cases each note is sent in advance to the plugin/instrument with a timestamp attached, so it will be played at the right time.

    • If the DAW is sending to an external MIDI device down a physical cable, this latency compensation is impossible and can only be set manually.

    • MIDI from a hardware sequencer/arpeggiator, or an actual human playing, comes into the DAW “live”, so the DAW could not possibly apply any buffering or time syncing.

    (That’s all I can think of right now)
    ....

    All these things come into play when connecting various pieces of gear together.
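The serial-bandwidth point in the first bullet above can be made concrete with a little arithmetic. The 31,250 baud rate and 10-bit byte framing are the standard DIN MIDI 1.0 figures; the chord example assumes running status (one status byte shared across the chord), which is an assumption about how the sender encodes it.

```python
# MIDI 1.0 over a 5-pin DIN cable runs at 31,250 baud, and each byte is
# framed as 10 bits (1 start + 8 data + 1 stop), i.e. 320 us per byte.

BAUD = 31250
BITS_PER_BYTE = 10

def midi_wire_ms(num_bytes):
    """Time in milliseconds to transmit num_bytes over a DIN cable."""
    return 1000.0 * num_bytes * BITS_PER_BYTE / BAUD

print(midi_wire_ms(3))   # 0.96 -> one note-on (status + note + velocity)
print(midi_wire_ms(21))  # 6.72 -> a 10-note chord with running status
```

So a "simultaneous" 10-note chord is necessarily smeared over several milliseconds on the wire, before any software latency is even considered.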

  • edited May 15

    For what it's worth, this post got me curious about the subject, and I ended up reading several forum threads regarding USB MIDI cables versus actual MIDI cables.

    I know it’s not specifically what was asked, but it seems like this thread could be a good resource for the wiki if we get some more info. Anyway, it seems most people consider physical MIDI cables superior in the majority of cases, especially for live performance, though they do suffer from the bottleneck mentioned above. USB is much faster, but apparently more likely to add jitter. This is based on the majority opinion; Wikipedia apparently amended its article on the subject, and the official MIDI site didn’t make things much clearer.

    I would definitely be curious to hear from an expert on the subject, although there definitely seems to be a lot of disagreement as to the best method, with a lot of it coming down to specific hardware and software. As far as iOS goes, perhaps someone like @j_liljedahl, @brambos or @SevenSystems might be able to shed some light on things?

  • @Thardus said:
    Would definitely be curious to hear from an expert on the subject, although it definitely seems theres a lot of disagreement as far as the best method, with a lot of it coming down to specific hardware and software. As far as iOS, perhaps someone like @j_liljedahl, @brambos or @SevenSystems might be able to shed some light on things?

    This might be useful: https://forum.audiob.us/discussion/32874/au-and-iaa-taxonomy

    It seems there are some unknown factors on iOS regarding Core MIDI and Core Audio. Michael seems to know a bit about it all and has some solutions, I believe.

    Link is definitely tighter than MIDI clock, at least for every app I have that supports MIDI clock.

  • edited May 15

    I can't say anything with absolute certainty, but if you use a MIDI interface with "real" MIDI ports that connects to the host via USB, then you have TWO factors introducing problems: the USB<>Interface connection which is not realtime-safe (jitter), AND the Interface<>Device connection over MIDI, which has the usual limited bandwidth, so gets more latency the more events are sent at once.

    To me it would seem that as soon as USB is involved in any way, you're screwed.

    My last two "serious" MIDI interfaces, back when I still had a hardware studio, connected over the parallel and serial ports (things that don't even exist anymore today), to which software had very low-level, protocol-less access. So no problems.

    So in the end, if your only option is USB, I'd say that "direct" MIDI over USB is the best option.

    BTW: all of this actually only applies to LIVE use. If you use a correctly implemented (!) sequencer like Xequence which queues up MIDI events in a buffer with timestamps, then none of this should be an issue because the events get sent long before they're actually supposed to be played and the timing should take place inside the target device or app.
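The "queue up events with timestamps" idea described above can be sketched in a few lines. This is a hypothetical illustration, not Xequence's actual implementation: events sit on a timestamp-ordered queue and are handed downstream a little ahead of time, so any jitter in the sending path happens early and the precise timing can be done by the receiver (OS, driver, or device).

```python
import heapq

class SendAheadScheduler:
    """Queue MIDI-like events by timestamp and hand them off early."""

    def __init__(self, send, lookahead=0.010):
        self.send = send            # downstream sink, takes (timestamp, event)
        self.lookahead = lookahead  # how early to hand events off, in seconds
        self.queue = []             # min-heap ordered by timestamp

    def schedule(self, timestamp, event):
        heapq.heappush(self.queue, (timestamp, event))

    def pump(self, now):
        # Hand off everything due within the lookahead window; the
        # receiver is responsible for playing each event at its timestamp.
        while self.queue and self.queue[0][0] <= now + self.lookahead:
            self.send(heapq.heappop(self.queue))

sent = []
s = SendAheadScheduler(sent.append)
s.schedule(0.500, "note_on C4")
s.schedule(0.250, "note_on E3")
s.pump(now=0.245)      # only the 0.250 event falls inside the 10 ms window
print(sent)            # [(0.25, 'note_on E3')]
```

Live input defeats this scheme entirely, of course: there is no future event to queue, which is exactly the distinction drawn in the post above.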

  • edited May 15

    @SevenSystems said:
    BTW: all of this actually only applies to LIVE use. If you use a correctly implemented (!) sequencer like Xequence which queues up MIDI events in a buffer with timestamps, then none of this should be an issue because the events get sent long before they're actually supposed to be played and the timing should take place inside the target device.

    No, the timing takes place in the operating system in that case, because MIDI itself is a realtime protocol and doesn't use timestamps.

  • edited May 15

    @brambos said:
    No, the timing takes place in the operating system in that case, because MIDI itself is a realtime protocol and doesn't use timestamps.

    Not necessarily. For example, if you use the Unitor 8 (connects via a serial port) with emagic (now Apple) Logic, Logic actually queues up the MIDI events along with timestamps inside the Unitor 8, which itself then takes care of timing the events on its 8 MIDI output ports. Windows 95 simply wasn't a very good realtime operating system :D

    But yeah, on iOS, you're right of course... I was talking in a more general sense :)

    (sorry, I'm old!)

    EDITED my post. Note though that even apps can choose to receive MIDI events in advance and so my statement basically holds... the target device (or app!) is responsible for timing.

  • edited May 15

    @SevenSystems said:
    Not necessarily. For example, if you use the Unitor 8 (connects via a serial port) with emagic (now Apple) Logic, Logic actually queues up the MIDI events along with timestamps inside the Unitor 8, which itself then takes care of timing the events on its 8 MIDI output ports. Windows 95 simply wasn't a very good realtime operating system :D

    But yeah, on iOS, you're right of course... I was talking in a more general sense :)

    (sorry, I'm old!)

    No, I wasn't talking about Windows or digital transmission of MIDI between operating systems.

    I'm talking about hardcore oldschool MIDI: going through a DIN-cable into a hardware synth. For me that's the pure MIDI protocol (without platform/OS specific proprietary additions). There is no timestamping in MIDI; so when a note on event is sent to my Juno 106, there is no timing handling inside the synth. All the timing is done in the software: i.e. in the sequencer or in the OS.

  • edited May 15

    @brambos said:
    No, I wasn't talking about Windows or digital transmission of MIDI between operating systems.

    I'm talking about hardcore oldschool MIDI: going through a DIN-cable into a hardware synth. For me that's the pure MIDI protocol (without platform/OS specific proprietary additions). There is no timestamping in MIDI; so when a note on event is sent to my Juno 106, there is no timing handling inside the synth. All the timing is done in the software: i.e. in the sequencer or in the OS.

    Right! And because that always led to problems especially with slow hardware and non-realtime-safe operating systems, companies like emagic came up with stuff like Logic and the Unitor 8, what I described above. That combo was the first time ever I even experienced absolutely spot on timing, and that has converted me forever into the "timing Nazi" I've become ;)

    But again yes, you're right of course about the barebones MIDI protocol itself. I was kind of talking about the bigger picture.

    (btw: it seems like even back then, they hit a nerve with that innovation... the Unitor 8 was "just" an 8 port in/out MIDI interface and cost a mere $1000 or so IIRC! So people apparently were in the market for good timing :))

  • edited May 15

    @SevenSystems said:
    Right! And because that always led to problems especially with slow hardware and non-realtime-safe operating systems, companies like emagic came up with stuff like Logic and the Unitor 8, what I described above. That combo was the first time ever I even experienced absolutely spot on timing, and that has converted me forever into the "timing Nazi" I've become ;)

    But again yes, you're right of course about the barebones MIDI protocol itself. I was kind of talking about the bigger picture.

    (btw: it seems like even back then, they hit a nerve with that innovation... the Unitor 8 was "just" an 8 port in/out MIDI interface and cost a mere $1000 or so IIRC! So people apparently were in the market for good timing :))

    Now I understand what you mean, but I wouldn't expect even a tiny minority of users to have such exotic in-between-hardware. Especially when hooking up iOS devices to hardware setups. If you send MIDI into hardware, you can have all the timestamping you want but no sequencer-implementation will ever save you from the jitter inherent in the realtimeness of the technology.

  • @brambos said:
    Now I understand what you mean, but I wouldn't expect even a tiny minority of users to have such exotic in-between-hardware. Especially when hooking up iOS devices to hardware setups. If you send MIDI into hardware, you can have all the timestamping you want but no sequencer-implementation will ever save you from the jitter inherent in the realtimeness of the technology.

    Yep, sorry for taking the thread away... sometimes I enjoy indulging in my strangeness ;)

  • @SevenSystems said:
    Yep, sorry for taking the thread away... sometimes I enjoy indulging in my strangeness ;)

    Naww.. I was just worried I missed a memo. I can hold on to my 1986 “Using MIDI” Guide for a bit longer. :)

  • @brambos said:
    Naww.. I was just worried I missed a memo. I can hold on to my 1986 “Using MIDI” Guide for a bit longer. :)

    Haha :) well, essentially we can agree on everything then: Because the MIDI protocol is so limited timing-wise as you described, lots of odd innovations were cobbled together to deal with that as I described ;)

  • @SevenSystems said:
    Yep, sorry for taking the thread away... sometimes I enjoy indulging in my strangeness ;)

    No worries. Very interesting. Not that I follow it 100% but the parts I do are interesting.

    @SevenSystems said:
    BTW: all of this actually only applies to LIVE use. If you use a correctly implemented (!) sequencer like Xequence which queues up MIDI events in a buffer with timestamps, then none of this should be an issue because the events get sent long before they're actually supposed to be played and the timing should take place inside the target device or app.

    Yes, all my issues that led me to start the thread are about live input. It all works no problem if recorded and quantized or close.

    Still stress testing the Ableton Link + Clock route but it is working so far. [knocks on wood]

  • edited May 15

    @SevenSystems said:
    To me it would seem that as soon as USB is involved in any way, you're screwed.

    A whole world of ground loops, jitter, and strange cables that only go in one orientation despite appearing to be deliberately designed to make this a challenge.

  • I don't think there's necessarily anything wrong with USB MIDI as such. @SevenSystems said:

    I can't say anything with absolute certainty, but if you use a MIDI interface with "real" MIDI ports that connects to the host via USB, then you have TWO factors introducing problems: the USB<>Interface connection which is not realtime-safe (jitter), AND the Interface<>Device connection over MIDI, which has the usual limited bandwidth, so gets more latency the more events are sent at once.

    I don't think there's any reason why a USB midi connection couldn't be made realtime safe - more the fact that most computers (Linux is the exception here) are not realtime safe. And so as soon as you introduce USB midi, you're introducing a computer that adds jitter, etc.

    It's a shame that Linux has never really taken off for audio/midi - as once it's configured properly you can get rock solid timing and super low latencies.

  • @SevenSystems said:
    Windows 95 simply wasn't a very good realtime operating system :D

    Actually it wasn't terrible if you configured it right. A surprising number of old Windows 95 boxes are still being used out there for exactly that purpose in factories. The reason computers today are mostly not rock solid on timing (Linux does it right, and there are specialty OSes like QNX, Wind River, etc.) is because they have pre-emptive scheduling. Windows 95 (and pre-OSX Macs) did not, which made them awful in all kinds of ways, but did mean that you could write predictable-timing code so long as you controlled ALL the software running on the machine.

  • edited May 15

    @iamspoon said:
    A whole world of ground loops, jitter and strange cables that only go in in one orientation despite appearing to be deliberately designed to make this a challenge.

    It's not so bad. Jitter and all, it's still in the realm of acceptable.

    https://www.innerclocksystems.com/litmus

    Roland booteek shit is worse.

  • @cian said:
    I don't think there's any reason why a USB midi connection couldn't be made realtime safe - more the fact that most computers (Linux is the exception here) are not realtime safe. And so as soon as you introduce USB midi, you're introducing a computer that adds jitter, etc.

    As mentioned, I'm not 100% sure, but I think the mere fact that USB is a very complex protocol with two-way communication and error correction etc. adds lots of non-deterministic overhead. Plain MIDI is one-way, "shoot and forget".

    It's a shame that Linux has never really taken off for audio/midi - as once it's configured properly you can get rock solid timing and super low latencies.

    Well, that's basically what iOS is. Yes it's based on FreeBSD, but on the whole it does a pretty good job on uniting the best of all worlds in a single OS.

  • edited May 15

    @cian said:
    It's a shame that Linux has never really taken off for audio/midi - as once it's configured properly you can get rock solid timing and super low latencies.

    I used Linux for a while during law school. To start, basically, I was too poor to afford a new computer. Linux really revitalized an old Dell I had.

    But I stopped after law school. I was watching one of the Terminator movies and realized....this computer has NEVER crashed since I put Red Hat on it. Got me worrying. Went back to Windows because I love humans.

  • @SevenSystems said:
    As mentioned, I'm not 100% sure, but I think the mere fact that USB is a very complex protocol with two-way communication and error correction etc. adds lots of non-deterministic overhead. Plain MIDI is one-way, "shoot and forget".

    There's no inherent reason why this should be the case, though I'm not super familiar with USB, so it's possible I guess. But it's more likely due to issues surrounding the switch from user-space to kernel-space, and the scheduler. These things are usually pretty fast, but not reliably fast (i.e. not real-time), hence jitter.

    It's a shame that Linux has never really taken off for audio/midi - as once it's configured properly you can get rock solid timing and super low latencies.

    Well, that's basically what iOS is. Yes it's based on FreeBSD, but on the whole it does a pretty good job on uniting the best of all worlds in a single OS.

    The Linux and FreeBSD kernels are pretty different FWIW. And these days the FreeBSD kernel and OSX have a lot of differences.

    iOS (and OSX) don't give you real-time guarantees. You can build a version of FreeBSD with a soft real-time scheduler, but OSX doesn't do that. And I don't know how good it is (the Linux version is very good, though Linux out of the box is plenty good for all but the most demanding of users).

    There are a couple of exceptions to do with audio (on the system side), which almost didn't happen... based on a conversation with the guy who is responsible for OSX (and iOS) having realtime audio that doesn't suck :smile:

  • @cian, you're probably more knowledgeable about those operating system internals than me, so I can't participate thoroughly in that discussion :) Though I do remember that back when I built my last "Music PC" (that was in 2003!), I was told by everyone to avoid USB like the plague, even turn it off completely in the BIOS, and to use only actual PCI cards for the audio interfaces and serial/parallel-port MIDI interfaces (I settled on the Unitor 8, as outlined above, because it had the timestamping going on).

    To the system's credit, it performed absolutely reliably at 256 samples buffer size in pretty much any situation except if I really went out of my way to make it crackle (and that was on a single-core P4 3.2 GHz!) :)

  • @BroCoast

    It just occurred to me when I was reading another thread: couldn't Cubasis be affecting the results of your tests due to its (relatively) low MIDI resolution, 48 PPQN I believe?
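For context on why PPQN could matter (a sketch; the 48 PPQN figure for Cubasis is the commenter's recollection, not a confirmed spec): a sequencer can only place events on tick boundaries, and one tick lasts 60 / (BPM x PPQN) seconds, so coarse resolution shows up as apparent latency or jitter in measurements.

```python
# Tick duration for a given tempo and sequencer resolution.

def tick_ms(bpm, ppqn):
    """Length of one sequencer tick in milliseconds."""
    return 1000.0 * 60.0 / (bpm * ppqn)

print(round(tick_ms(120, 48), 2))   # 10.42 -> coarse enough to look like jitter
print(round(tick_ms(120, 960), 2))  # 0.52  -> typical desktop DAW resolution
```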

  • edited May 16

    Mis-post
