[Audiounit programming] - Develop an Audiounit in Swift

I have an application that should also work as an AudioUnit plugin. In Xcode, I go to File > New > Target, select AudioUnit, and make sure that the selected language is "Swift". In the generated code, though, the actual plugin code lives in ".h" and ".m" Objective-C files:

#import "ChordezAuAudioUnit.h"

#import <AVFoundation/AVFoundation.h>

// Define parameter addresses.
const AudioUnitParameterID myParam1 = 0;

@interface ChordezAuAudioUnit ()

@property (nonatomic, readwrite) AUParameterTree *parameterTree;
@property AUAudioUnitBusArray *inputBusArray;
@property AUAudioUnitBusArray *outputBusArray;
@end


@implementation ChordezAuAudioUnit
@synthesize parameterTree = _parameterTree;

- (instancetype)initWithComponentDescription:(AudioComponentDescription)componentDescription

[...]

How do I develop the plugin in Swift? In this GitHub project, the author seems to be doing exactly that, but I don't know how to replace the generated code with the Swift version: https://github.com/inquisitiveSoft/MIDIAudioUnitExample/blob/main/TransposeOctaveAudioUnit/MIDIAudioUnit/MIDIAudioUnit.swift

Comments

  • My understanding is that this is a big no-no: https://github.com/inquisitiveSoft/MIDIAudioUnitExample/blob/2ddab2a01fb325c8f46c888026c2365b37daa2a4/TransposeOctaveAudioUnit/MIDIAudioUnit/MIDIAudioUnit.swift#L67

    Don't use Swift for audio unit kernels. To quote Apple: "Capture in locals to avoid Obj-C member lookups. If 'self' is captured in render, we're doing it wrong. See sample code." The same applies to Swift AFAIK, but I'm happy to be proven wrong. (There's a small sketch of the capture-in-locals pattern at the end of the thread.)

  • Exactly. As I understand it, the runtime features that make Swift and Objective-C work (reference counting, dynamic dispatch, and so on) add overhead that interferes with real-time processing. Remember, the audio render code has to run once per audio buffer, many times a second, in perfect lock-step, to avoid audible glitches. So the code needs to be closer to the "bare metal" than those languages comfortably allow.

    You should take a look at this thread, https://forum.audiob.us/discussion/42544/x-was-about-programming-au-and-which-language-to-use/p1, which has a lot of discussion on these issues.

  • @pistacchio said:
    I have an application that should also work as an AudioUnit plugin. In Xcode, I go to File > New > Target, select AudioUnit, and make sure that the selected language is "Swift". In the generated code, though, the actual plugin code lives in ".h" and ".m" Objective-C files:

    [...]
    How do I develop the plugin in Swift? In this GitHub project, the author seems to be doing exactly that, but I don't know how to replace the generated code with the Swift version: https://github.com/inquisitiveSoft/MIDIAudioUnitExample/blob/main/TransposeOctaveAudioUnit/MIDIAudioUnit/MIDIAudioUnit.swift

    Do check out the thread @uncledave linked.

    The issue you have to deal with, in whatever language you code the DSP in, is that you are in a soft real-time setting. There is an upper bound on the amount of time you have to execute your code in each of the pulls the audio thread makes on your DSP method, so you can't do anything that takes unbounded time. This is true whatever language you use, even C and C++. In particular, you can't make system calls, like memory allocation or deallocation, in your DSP code. That is hard to avoid in languages like Swift (or Objective-C), where there is a lot going on in the runtime support for the language.

    In the above-mentioned thread, I posted a project that demonstrates how to use Swift purely as the bridge out to other languages, without using Objective-C. The point of that project is not to use Swift as the language you write your DSP in. That is possible, but don't do it! It is hard not to mess up and make a call that is unbounded in Swift. Avoiding that is pretty easy in C and a little bit harder in C++, which pretty much explains why those are the languages of choice for almost all of this kind of work.

    BTW, the template that you get from Xcode right now is a bit of a mess. It puts in place the files you need for doing development in both Swift and Objective-C, and the two are kind of tangled together. Look at Apple's demonstration code on building an Audio Unit in the dev docs: it shows how to bridge to C++ using a Swift AUAudioUnit class that goes through an Objective-C++ bridge to reach the C++ code that does the DSP. You basically need just one small Objective-C++ shim layer. (A rough Swift-side sketch of this arrangement follows below.)
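
To illustrate the "capture in locals" advice from the comments above, here is a minimal Swift sketch. The class name and the transpose value are made up for illustration, and bus setup, the parameter tree, and the actual MIDI/DSP work are all omitted; the point is only that the render block reads plain local values and never touches "self":

import AVFoundation
import AudioToolbox

// Illustrative only: a bare-bones AUAudioUnit subclass written in Swift.
// Bus arrays, the parameter tree, and real event handling are omitted.
public class CaptureInLocalsAudioUnit: AUAudioUnit {

    // State owned by the audio unit. In a real plugin this would usually be
    // a pointer to a C/C++ kernel rather than a Swift property.
    private var transpose: Int32 = 12

    public override var internalRenderBlock: AUInternalRenderBlock {
        // Copy everything the block needs into locals *before* returning it.
        // The block below must never reference `self`, otherwise every render
        // call pays for Obj-C/Swift member lookups on the audio thread.
        let transpose = self.transpose

        return { actionFlags, timestamp, frameCount, outputBusNumber,
                 outputData, realtimeEventListHead, pullInputBlock in
            // Real-time-safe work only: no allocation, no locks, no I/O.
            // Walk realtimeEventListHead, shift MIDI notes by `transpose`,
            // fill outputData, and so on.
            _ = transpose
            return noErr
        }
    }
}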
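
And here is a rough Swift-side sketch of the bridge arrangement described in the last comment: the AUAudioUnit subclass stays in Swift, but its render block only forwards to a C (or Objective-C++/C++) kernel that does the real-time work. The DSPKernel_* functions are hypothetical names standing in for whatever the shim actually exposes through the bridging header:

import AVFoundation
import AudioToolbox

// Hypothetical C interface, assumed to be exported by a small Objective-C++
// shim (or a plain C header) through the target's bridging header:
//
//   void *DSPKernel_create(double sampleRate);
//   void  DSPKernel_process(void *kernel,
//                           AudioBufferList *outputData,
//                           const AudioTimeStamp *timestamp,
//                           uint32_t frameCount);
//   void  DSPKernel_destroy(void *kernel);

public class BridgedAudioUnit: AUAudioUnit {

    // Opaque pointer to the C/C++ kernel that does all the real-time work.
    private var kernel: UnsafeMutableRawPointer?

    public override init(componentDescription: AudioComponentDescription,
                         options: AudioComponentInstantiationOptions = []) throws {
        try super.init(componentDescription: componentDescription, options: options)
        // A real plugin would pass the output bus format's sample rate here.
        kernel = DSPKernel_create(44_100)
    }

    deinit {
        if let kernel = kernel { DSPKernel_destroy(kernel) }
    }

    public override var internalRenderBlock: AUInternalRenderBlock {
        // Capture the raw kernel pointer; the block never references `self`.
        let kernel = self.kernel
        return { _, timestamp, frameCount, _, outputData, _, _ in
            guard let kernel = kernel else { return kAudioUnitErr_Uninitialized }
            // Everything time-critical happens on the far side of the shim.
            DSPKernel_process(kernel, outputData, timestamp, frameCount)
            return noErr
        }
    }
}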
