development platform question

As both a musician and longtime computer techie, I've jumped into Swift development (because I can 😎). I'm sure it will be a while before I can create anything useful to others, but as I get further along in the learning process, I wonder what "real" developers do.

Are most of the many excellent apps discussed in this forum developed in Swift or in Objective-C (or something else)?

I'm beginning to sense that both are important. Objective-C is just another language, but I wonder whether I'll have to learn it as well to accomplish anything serious.

Comments

  • @motmeister I believe the actual audio processing, certainly in an AUv3, has to be coded in C++ or even plain C. The higher-level languages rely on conveniences, like automatic reference counting and heap allocation, that make them unsuitable for real-time processing. So you really need a spectrum of capabilities to build music apps.
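
    To make that concrete, here's a rough Swift sketch of the discipline involved. GainKernel is a made-up name, not an Apple API, and in practice this layer is usually written in C or C++; but the rules it follows are what "real-time safe" means: allocate everything up front, and do no allocation, locking, or object access in the render path.

        // A hypothetical per-sample gain processor, written to be real-time safe:
        // all state lives in a plain value type allocated up front, and the
        // render path does nothing but pointer arithmetic and math.
        struct GainKernel {
            var gain: Float = 0.5

            // Called on the audio thread for each buffer of samples.
            func process(_ input: UnsafePointer<Float>,
                         _ output: UnsafeMutablePointer<Float>,
                         frameCount: Int) {
                for i in 0..<frameCount {
                    output[i] = input[i] * gain
                }
            }
        }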

  • Check out the "AudioKit" project for a lot of Open Source code and examples to get you making apps fast.

    https://audiokit.io/
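
    For a first taste, here's a minimal sketch of the classic AudioKit hello-world. Hedged: the module layout has moved around between AudioKit versions, and in recent ones the Oscillator node lives in the separate SoundpipeAudioKit package.

        import AudioKit
        import SoundpipeAudioKit  // Oscillator lives here in recent versions

        // Build a tiny signal chain: sine oscillator -> engine output.
        let engine = AudioEngine()
        let osc = Oscillator(waveform: Table(.sine), frequency: 440)
        engine.output = osc

        do {
            try engine.start()  // spin up the underlying AVAudioEngine
            osc.start()         // begin generating the tone
        } catch {
            print("AudioEngine failed to start: \(error)")
        }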

  • @motmeister said:
    As both a musician and longtime computer techie, I've jumped into Swift development (because I can 😎). I'm sure it will be a while before I can create anything useful to others, but as I get further along in the learning process, I wonder what "real" developers do.

    Are most of the many excellent apps discussed in this forum developed in Swift or in Objective-C (or something else)?

    I'm beginning to sense that both are important. Objective-C is just another language, but I wonder whether I'll have to learn it as well to accomplish anything serious.

    If you want to do AUv3s, you are going to have to know them all. Actually, you will need at least Objective-C plus C or C++, for now. We are getting closer to the point where you will be able to work in only Swift and C or C++, but we aren't quite there yet.

    I've recently shifted to doing almost everything in Swift (UI and main Audio Unit code), with only a small layer of Objective-C++ bridging to the core audio processing in C++. That Objective-C++ layer is pretty much boilerplate and needs little change from one AU to the next.

    The actual DSP (what Apple labels the kernel in their template) has to be in some real-time-capable language, and you need to be able to bridge it to Swift or Objective-C.

    The easiest path for someone starting out is probably Swift for the UI and main AU class, and then C++ for the DSP; there are lots of DSP examples out there, and they are mainly in C++. But I hate C++, so I feel bad saying this. Still, it's mostly true.
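
    To make the layering concrete, here is a stripped-down sketch of the Swift side. The class name is made up, and the render block body just writes silence so the skeleton stands alone; in a real AU that body is where an Objective-C++ adapter would call into the C++ kernel.

        import AVFoundation

        class SketchAudioUnit: AUAudioUnit {
            override var internalRenderBlock: AUInternalRenderBlock {
                return { actionFlags, timestamp, frameCount, outputBusNumber,
                         outputData, realtimeEventListHead, pullInputBlock in
                    // Render thread: no allocation, no locks, no Swift/ObjC
                    // object traffic. A real AU would hand these pointers to
                    // the C++ kernel; this sketch just zeroes the buffers.
                    let buffers = UnsafeMutableAudioBufferListPointer(outputData)
                    for buffer in buffers {
                        memset(buffer.mData, 0, Int(buffer.mDataByteSize))
                    }
                    return noErr
                }
            }
        }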

  • All of your answers are eye-opening, but not frightening, I guess. I've written code in so many languages since 1967 that I've really come to understand that the differences are mostly form and syntax. I just never took the time to try C or C++ (or Objective-C), but with COVID, I certainly have time, right?

    As a result of seeing your comments, I'm going to Google how to learn some simple coding (to begin with) in both Objective-C and C++ on the Mac platform.

    I volunteer as a pianist at our community hospital, and as a result of their volunteer program, I was able to get the first of the two shots of Pfizer vaccine, and I'll get the second in a bit less than two weeks. That puts me ahead of the curve here, because vaccines have been slow to catch up with the need. Relative isolation is going to continue until the vaccine starts to significantly lower the rate of new infections, so I'll have lots of time to learn.

    I really work well from examples, so AudioKit and open source will be very helpful to me. They certainly have been helpful in getting up and running in Swift.

    The members of this Forum are world-class! Your comments will guide me in the right direction. Thank you!

  • How did your next steps go? I'm also interested in learning, and I'm already familiar with SwiftUI.

  • As I understand it, new developers and kids who start with Swift have an easier time learning it than advanced developers coming to it from other languages, who have to deal with all of Swift's vagaries.

  • After having done some research into what threading is going on in iOS for audio, I'm going to have to change my original answer to this question. Looking at the threads running in many AUs, it's apparent that lots and lots of AUs on iOS are written using JUCE. So it's possible that C++ has the bigger usage for UIs over either Objective-C or Swift. With the way the VST3 library now lets you generate AUv3s, I'd expect C++ UIs to become even more prevalent. There still needs to be an Objective-C/Swift bridge in there somewhere, but going this path it's something you may never see.

  • Swift has a lot of complexity and modern concepts that support advanced features like SwiftUI but aren't necessary to learn inside and out for day-to-day application development (speaking as a Swift dev who comes from a web dev background).

  • Does anyone know if it's possible to create an AUv3 plugin using the VST3 dev kit and web technologies for the GUI?
    @blueveek used some CSS for the UI of Atom2; how did he go about it?

  • @Jeezs said:
    Does anyone know if it's possible to create an AUv3 plugin using the VST3 dev kit and web technologies for the GUI?
    @blueveek used some CSS for the UI of Atom2; how did he go about it?

    I don't know anything about using the VST3 SDK to generate an AUv3 other than that it is supposed to be possible, according to Steinberg. So whether you can use HTML/JavaScript/CSS in a VST3 C++ UI is something I can't say either way.

    From the AUv3 standpoint, there are really only three interfaces you have to support, and they are all Objective-C or Swift APIs: the AUAudioUnit instance, the AUParameterTree instance, and whatever class you have set up to provide the AUAudioUnitFactory protocol. They are thin, though, in that you can use them to bridge to underlying implementations fairly easily. The AUAudioUnit and AUParameterTree instances are the interfaces the host and the OS use to communicate with the AU; they are also the way information from the host, the OS, and the main processing code reaches the UI. So, within that structure, you can implement the UI pretty much any way you want, as long as what you are using will run in the view instance that is provided to you. For example, you could use the UIView provided in the standard AUv3 template, populate it with web views, and then use JavaScript to communicate with the AUParameterTree (there's a rough sketch of this at the end of this comment).

    I'd be careful about possible performance issues with this, though. For example, in the AU I'm working on now, one of the UI elements has to be written using Metal to get good enough performance on older iPhones. Even doing it in straight Swift isn't fast enough.
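
    As a rough sketch of the web-view approach (all names here are made up; it assumes the AU's parameterTree has a parameter at address 0, and "setParam" is just an arbitrary message-handler name):

        import UIKit
        import WebKit
        import AVFoundation

        class WebUIViewController: UIViewController, WKScriptMessageHandler {
            var parameterTree: AUParameterTree?  // handed over by the AU
            private var webView: WKWebView!

            override func viewDidLoad() {
                super.viewDidLoad()
                let config = WKWebViewConfiguration()
                // JavaScript reaches native code via
                // window.webkit.messageHandlers.setParam.postMessage(...)
                config.userContentController.add(self, name: "setParam")
                webView = WKWebView(frame: view.bounds, configuration: config)
                view.addSubview(webView)
                webView.loadHTMLString(
                    "<input type='range' min='0' max='1' step='0.01' " +
                    "oninput='window.webkit.messageHandlers.setParam.postMessage(this.value)'>",
                    baseURL: nil)
            }

            // Forward slider moves from JavaScript into the parameter tree.
            func userContentController(_ userContentController: WKUserContentController,
                                       didReceive message: WKScriptMessage) {
                guard let value = (message.body as? NSString)?.floatValue,
                      let param = parameterTree?.parameter(withAddress: 0) else { return }
                param.value = AUValue(value)
            }
        }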

  • @NeonSilicon said:
    After having done some research into what threading is going on in iOS for audio, I'm going to have to change my original answer to this question. Looking at the threads running in many AUs, it's apparent that lots and lots of AUs on iOS are written using JUCE. So it's possible that C++ has the bigger usage for UIs over either Objective-C or Swift. With the way the VST3 library now lets you generate AUv3s, I'd expect C++ UIs to become even more prevalent. There still needs to be an Objective-C/Swift bridge in there somewhere, but going this path it's something you may never see.

    Are there still fears about the demise of JUCE?

  • @AlmostAnonymous said:

    @NeonSilicon said:
    After having done some research into what threading is going on in iOS for audio, I'm going to have to change my original answer to this question. Looking at the threads running in many AUs, it's apparent that lots and lots of AUs on iOS are written using JUCE. So it's possible that C++ has the bigger usage for UIs over either Objective-C or Swift. With the way the VST3 library now lets you generate AUv3s, I'd expect C++ UIs to become even more prevalent. There still needs to be an Objective-C/Swift bridge in there somewhere, but going this path it's something you may never see.

    Are there still fears about the demise of JUCE?

    I don't know about the current state of any concerns about JUCE. I've never really liked either the technical side of JUCE or its licensing model, so I don't keep up with it much. I do know that they were sold to PACE, which killed off any remaining interest I might have had.

  • Yeah, everyone was wondering if one day it would get locked down or saddled with some crazy licensing.
    Knowing PACE, I'm sure that day will come.
