AU and IAA Taxonomy

There is a lot of confusion about the various flavors of AUs and IAA apps. I would love to come up with a (semi)definitive taxonomy for the wiki.

This will be helpful for us users in understanding the types of AU and IAA -- and it will also help us better identify app and plug-in capabilities on the wiki.

@brambos @Michael @j_liljedahl (apologies for at-ting you all: but you all know what you are talking about)

Comments

  • McDMcD
    edited May 13

    Here are some details to kick-start the discussion. I added AudioBus as the third option
    since it's still implemented in many apps alongside AUv3.

    Inter-App Audio (IAA)

    Inter-App Audio (IAA) was introduced by Apple in 2013 with iOS 7 to route audio and MIDI signals between a host application and multiple node applications. There are three types of IAA nodes:

    Instruments (can receive MIDI signals and produce audio signals)
    Generators (can produce audio signals)
    Effects (can receive, transform and send back audio signals)

    Audio Units (AU)

    Audio Units (AU) are a plug-in architecture in Core Audio. Audio Units are Apple's architectural equivalent to desktop Virtual Studio Technology (VST) plug-ins.

    AU Types include:

    Instruments (can receive MIDI signals and produce audio signals)
    Generators (can produce audio and/or MIDI signals)
    Effects (can receive, transform and send back audio and/or MIDI signals)

    Audiobus

    Audiobus provides similar APIs for interconnecting audio and MIDI.

    Audiobus defines three different audio/MIDI capabilities: sending, filtering and receiving.

    Audio senders transmit audio/MIDI to other apps (audio receivers or filters) or to the device's audio output.

    Audio filters accept audio/MIDI input, process it, then send it onwards to another app over Audiobus. This allows applications to apply effects to the audio stream. Audio filters also behave as inputs or receivers, and can go in the "Input" and "Output" positions in Audiobus.

    Audio receivers accept audio from audio sender or filter apps. What is done with the received audio depends on the app.

  • edited May 13

    Oooh, nice!

    Any time someone talks about taxonomy and IA (information architecture) it immediately gets my attention.

    Good idea. I'll give this some thought -- do some card sorting -- and get back to you shortly with a (hopefully) helpful contribution of my own.

  • wimwim
    edited May 13

    We also need an App Taxidermy for apps that are no longer supported on current iOS versions. ;)

  • _ki_ki
    edited May 14

    Some remarks (to be incorporated into the statements of @McD):

    IAA and AUv3 are music-related programming interfaces from Apple.

    • IAA allows developers to build standalone fullscreen apps with audio and MIDI connections to an IAA host app.
    • AUv3 is a more advanced plug-in format that allows multiple instances to run inside an AU host.

    The Audiobus SDK is a proprietary programming interface by Audiobus Pty Ltd, layered on top of IAA, which offers additional features when an app is loaded into the Audiobus host app.

    An app can support multiple programming interfaces, so there are AB3 apps (which always support the IAA interface) that also implement AUv3. IAA, AU and AB3 are used as common abbreviations to classify music apps.

    The Audiobus 3 SDK offers the following additional features:

    • App switching with the AB3 panel
    • State saving (if implemented by the app)
    • Remote control using additional trigger buttons in the AB3 panel (if implemented by the app)

    IAA apps without the state-saving AB3 extension will reload the last used configuration if they follow Apple's app design rules; others always start with default settings. The user has to save and load presets when changing sessions in the host, and each IAA app has to implement its own preset management system.

    AUv3 plug-ins save and restore their settings with the AU host's session. The AUv3 host can offer a host-local preset management system, and an AUv3 plug-in can additionally implement host-independent preset management.
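    As a conceptual sketch of that difference: an AUv3 plug-in exposes its state to the host (via the fullState dictionary on AUAudioUnit), and the host serializes that blob into its own session. The toy model below (Python, purely illustrative; the class and method names are made up and are not real API) shows the round trip:

```python
class HostSession:
    """Toy model of an AUv3 host session: the host persists each
    plug-in's opaque state blob and hands it back on reload."""
    def __init__(self):
        self.saved_states = {}

    def save(self, slot: str, plugin) -> None:
        # The host doesn't interpret the state; it just stores a copy.
        self.saved_states[slot] = dict(plugin.full_state)

    def reload(self, slot: str, plugin) -> None:
        plugin.full_state = dict(self.saved_states[slot])


class ToySynth:
    """Toy AUv3 plug-in: its parameters round-trip through full_state."""
    def __init__(self):
        self.full_state = {"cutoff": 0.5, "resonance": 0.1}


session = HostSession()
synth = ToySynth()
synth.full_state["cutoff"] = 0.9   # user tweaks a parameter
session.save("track1", synth)      # host saves its session

fresh = ToySynth()                 # a new instance starts at defaults
session.reload("track1", fresh)    # host restores the saved state
print(fresh.full_state["cutoff"])  # 0.9
```

    An IAA app without AB3 state saving has no equivalent of this handshake, which is why the user (or the app's own preset system) has to carry the settings between sessions.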


    The Audiobus 3 app is the only host for AB3 SDK apps, enabling the AB panel and AB state-saving. It also hosts standard IAA apps and AUv3 plug-ins.

    There are many hosts for IAA and/or AUv3 plugins.

    An AUv3 plugin cannot take the role of hosting IAA apps or other AUv3 plug-ins.

  • wimwim
    edited May 14

    @_ki said:
    IAA apps without state-saving AB3 extension will reload the last used configuration, ...

    Great stuff. :)
    I think this part may not be true for all apps, though. I think some open to a default preset each time.

  • _ki_ki
    edited May 14

    @wim said:
    I think this part may not be true for all apps, though. I think some open to a default preset each time.

    You are right.

    I edited the post and added the remark "if they follow Apple's app design rules" :)

  • Has anyone else noticed that IAA sync does not work anymore? At least with any app I have that supported it.

  • @McD said:
    Here are some details to kick-start the discussion. I added AudioBus as the third option
    since it's still implemented in many apps alongside AUv3.

    Inter-App Audio (IAA)

    Inter-App Audio (IAA) was introduced by Apple in 2013 with iOS 7 to route audio and MIDI signals between a host application and multiple node applications. There are four types of IAA nodes:

    Instruments (can receive MIDI signals and produce audio signals)
    Generators (can produce audio signals)
    Effects (can receive, transform and send back audio signals)

    MusicEffects (can receive MIDI messages, receives and processes audio signals)

    Audio Units (AU)

    Audio Units (AU) are a plug-in architecture in Core Audio. Audio Units are Apple's architectural equivalent to desktop Virtual Studio Technology (VST) plug-ins.

    AU Types include:

    Instruments (can receive MIDI signals and produce audio signals and MIDI messages)

    Generators (can produce audio and/or MIDI signals)
    Effects (can receive, transform and send back audio and/or MIDI signals)

    MusicEffects (can receive MIDI messages, receives and processes audio signals)

    MIDIProcessors (can receive MIDI, produces MIDI, no audio)
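    For reference, these AU types are identified in Apple's AudioToolbox headers by four-character OSType codes (kAudioUnitType_MusicDevice = 'aumu', and so on). A small sketch (Python, for illustration only) of the codes and how CoreAudio packs them into 32-bit integers:

```python
# AUv3 types and their four-character OSType codes, as declared in
# Apple's AudioToolbox headers (kAudioUnitType_*):
AU_TYPES = {
    "aumu": "Instrument (MusicDevice): receives MIDI, produces audio",
    "augn": "Generator: produces audio, no MIDI input",
    "aufx": "Effect: processes audio",
    "aumf": "MusicEffect: processes audio and also receives MIDI",
    "aumi": "MIDIProcessor: receives and produces MIDI, no audio",
}

def fourcc(code: str) -> int:
    """Pack a four-character code into the big-endian 32-bit integer
    that CoreAudio uses to identify component types."""
    if len(code) != 4:
        raise ValueError("OSType codes are exactly four ASCII characters")
    return int.from_bytes(code.encode("ascii"), "big")

for code, role in AU_TYPES.items():
    print(f"'{code}' = 0x{fourcc(code):08X}  {role}")
# 'aumu' = 0x61756D75, for example
```

    Hosts use these type codes to decide where a plug-in can be inserted (instrument slot, effect slot, MIDI slot).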

  • How does AU handle sync compared to Ableton Link? AU seems to suffer from jitter while Link does not seem to?

  • edited May 14

    @BroCoast said:
    How does AU handle sync compared to Ableton Link? AU seems to suffer from jitter while Link does not seem to?

    AU itself does not handle sync. It simply tells the plugin the current tempo and position in the timeline (currentBeatPosition, etc.), and it's then up to the plugin to translate that into the desired synchronization method. There is no inherent jitter in AU; that's totally down to the individual plugins.

    Edit: incoming MIDI/AU events are timestamped. If a plugin takes the easy way out and ignores the timestamp, simply handling all events at the beginning of the buffer render call, this can lead to jitter.
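    To illustrate that edit: the sketch below (Python, illustrative only; the buffer size is an assumption) contrasts honoring an event's sample offset with quantizing every event to the start of its render buffer, which shifts events by up to one buffer's worth of jitter:

```python
BUFFER_SIZE = 256  # frames per render call (a typical host setting)

def schedule_sample_accurate(event_frame: int) -> int:
    """Honor the event's timestamp: render it at its exact frame."""
    return event_frame

def schedule_buffer_start(event_frame: int) -> int:
    """The 'easy way out': handle every event at the start of the
    render buffer it falls in, discarding its intra-buffer offset."""
    return (event_frame // BUFFER_SIZE) * BUFFER_SIZE

# An event due at frame 1000 lands in the buffer covering frames 768-1023,
# so the lazy approach plays it up to one buffer early:
event = 1000
print(schedule_sample_accurate(event))       # 1000
print(schedule_buffer_start(event))          # 768
print(event - schedule_buffer_start(event))  # 232 frames of error
```

    At 44.1 kHz a 256-frame buffer is about 5.8 ms, so this shortcut alone produces audible timing wobble; a plugin that offsets each event within the render call avoids it entirely.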

  • edited May 14

    @brambos said:

    AU itself does not handle sync. It simply tells the plugin the current tempo and position in the timeline (currentBeatPosition, etc.), and it's then up to the plugin to translate that into the desired synchronization method. There is no inherent jitter in AU; that's totally down to the individual plugins.

    Thanks :smile:

    From what I could gather from rather unscientific testing today:

    Link has no inherent jitter; it has consistent latency that increased as I used more apps.
    AU (using Rozeta Bassline) had a small amount of jitter but almost no latency.

    That was triggering a hardware synth. I am just trying to get a definitive understanding of how the various sync methods work on iOS.
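    One way to make such tests a bit less unscientific is to separate the two quantities being compared: latency is the mean offset of received events from the expected grid, jitter is the spread around that mean. A rough sketch (Python; the measurement values are entirely made up for illustration):

```python
from statistics import mean, pstdev

def latency_and_jitter(expected_ms, measured_ms):
    """Latency = mean offset from the expected grid;
    jitter = standard deviation of that offset."""
    offsets = [m - e for e, m in zip(expected_ms, measured_ms)]
    return mean(offsets), pstdev(offsets)

# Hypothetical onset times for quarter notes at 120 BPM (500 ms grid):
grid = [0, 500, 1000, 1500, 2000]
link_like = [12.0, 512.0, 1012.0, 1512.0, 2012.0]  # constant 12 ms late
au_like   = [1.0, 503.0, 998.0, 1502.0, 1999.0]    # small spread, ~0 mean

lat, jit = latency_and_jitter(grid, link_like)
print(f"Link-like: latency {lat:.1f} ms, jitter {jit:.1f} ms")  # 12.0, 0.0
lat, jit = latency_and_jitter(grid, au_like)
print(f"AU-like:   latency {lat:.1f} ms, jitter {jit:.1f} ms")  # 0.6, 1.9
```

    By this framing the observations above are consistent: Link trades a constant (and compensatable) delay for zero spread, while the AU chain has near-zero delay but a small spread introduced somewhere downstream.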

  • edited May 14

    @BroCoast said:

    From what I could gather from rather unscientific testing today:

    Link has no inherent jitter, it has consistent latency that increased in amount when I used more apps.
    AU (using Rozeta Bassline) had a small amount of jitter but almost no latency.

    That was triggering a hardware synth. I am just trying to get a definitive understanding of how the various sync methods work on iOS.

    As far as I'm aware, timing of events in Rozeta Bassline should be sample-accurate (AU MIDI I/O handling happens on the audio-rendering-thread), so any jitter is happening downstream in the chain.

  • @brambos said:
    As far as I'm aware, timing of events in Rozeta Bassline should be sample-accurate (AU MIDI I/O handling happens on the audio-rendering thread), so any jitter is happening downstream in the chain.

    You're correct. The jitter is from iPad > audio interface MIDI > synth.

    What threw me was there is no jitter with Link using the same setup, just a fixed amount of latency. Not sure how that works. :lol:

    When triggering AU or IAA synths it is indeed sample accurate.

  • @BroCoast said:
    What threw me was there is no jitter with Link using the same setup, just a fixed amount of latency. Not sure how that works. :lol:

    Hmmm, yeah, there are some black boxes in the chain, like the CoreMIDI framework. We can only guess what happens in there in terms of buffering, queueing and thread prioritization. It may even differ between devices or iOS versions.
