Digital Signal Processing with iOS/iPadOS tools

As some of you may be aware, Sean Costello of Valhalla DSP fame has recently been publishing a series on his blog called 'Getting Started with Reverb Design'. And it should come as no surprise that it's a damned fine read.

https://valhalladsp.com/blog/

In the first post, he talks about development environments for DSP. Interestingly, he suggests that much as the JUCE SDK and C++ are utilised for his final plugin publishing workflow, he finds Cycling '74's Max and the Max for Live environment in Ableton to be perfect environments for prototyping. He also discusses Bitwig and Reaktor, but only as suggestions; his own experience is with Max.

This has made me wonder what the best option is for prototyping DSP tools on iOS/iPadOS. Audulus, which has its v4 public beta in progress right now, seems to be the best fit, but I'm not sure that it's a true alternative to Cycling '74's Max, so I'm interested in other viewpoints. Is there anything else that I'm missing on iOS? I was even wondering whether Swift Playgrounds could be utilised for prototyping.

I must admit that after my initial excitement regarding Audulus, I found myself always returning to Max/Max for Live, as the Cycling '74 product has some of the best help documentation I've ever encountered in a development product of this nature. Plus it has a huge library of API assets you can lean on, rather than having to build up everything at an atomic level.

Anyway, over to you more technically minded folk in the Audiobus community. Really interested to hear your thoughts and suggestions.

Comments

  • Apple's restrictions on what you can do on iOS make the tools I would suggest using mostly impossible. I'm talking about things like SuperCollider or Csound that are more aimed at a live coding environment for audio. If it were possible I'd really suggest looking into Faust. There is some stuff you can do with Faust in a web browser but I don't think this makes for a great prototyping workflow.

    Personally, I prototype on the Mac, and I do this using C and C++ in some skeleton AUs I have set up for prototyping. Even when I go to build an AU for release on iOS, I do the entire audio-path dev and testing on the Mac before I do any testing on an iOS device. It's easier to test and verify on the Mac. Only when I get to the point of working on the touch portions of the UI do I bring that onto an iOS device. And I typically do the development of new UI parts outside of an AU or audio context on the iOS device.
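
    A minimal sketch of the kind of block-based process routine such a skeleton AU might wrap (the filter and all names are illustrative, not taken from this thread); it compiles as a plain command-line test harness on the Mac:

      // Illustrative one-pole lowpass in plain C++; process() is shaped like an
      // AU render callback so the same code can later be dropped into a skeleton AU.
      #include <cmath>
      #include <cstdio>
      #include <cstdlib>
      #include <vector>

      struct OnePoleLowpass {
          float a = 0.0f;   // smoothing coefficient
          float z = 0.0f;   // filter state
          void setCutoff(float hz, float sampleRate) {
              a = 1.0f - std::exp(-6.2831853f * hz / sampleRate);
          }
          void process(const float* in, float* out, int numFrames) {
              for (int n = 0; n < numFrames; ++n) {
                  z += a * (in[n] - z);          // y[n] = y[n-1] + a * (x[n] - y[n-1])
                  out[n] = z;
              }
          }
      };

      int main() {
          OnePoleLowpass lp;
          lp.setCutoff(1000.0f, 44100.0f);
          std::vector<float> in(512), out(512);
          for (auto& s : in)                     // crude noise burst as a test signal
              s = 2.0f * (float(std::rand()) / RAND_MAX) - 1.0f;
          lp.process(in.data(), out.data(), int(in.size()));
          std::printf("first filtered sample: %f\n", out[0]);
      }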

  • I had a feeling that would be the case. It's such a pity that there isn't a true equivalent of Max on iOS. I'm surprised Cycling '74 and Ableton haven't looked into this, as Max for Live is effectively a walled garden so iOS/iPadOS file system restrictions wouldn't get in the way. iPadOS coupled with a recent generation iPad certainly has the processing grunt to deliver a Max/Ableton mobile environment.

    I dabbled with Csound and SuperCollider back in the day, but I found my groove more when I discovered Curtis Roads' many books, and that brought me to Max and later Max for Live. A few other connected computer music books made for good reading: The Audio Programming Book (edited by Richard Boulanger and Victor Lazzarini), which led to other classics such as Max V. Mathews' The Technology of Computer Music (an essential read even though it was published in 1969) and The Csound Book. Much as I stick to Max, I found learning about techniques in Music V and Csound very inspiring. The book I find myself dipping into a lot at the moment is John R. Pierce's The Science of Musical Sound.

    I have no real interest in creating plugins but I do love creating little utilities for my own work in Ableton and Max for Live is perfect for that scenario. Audulus would be so much more appealing if your creations could be standalone AUv3s, but as you say, Apple's restrictions would never allow for something like that.

  • @jonmoore said:
    I had a feeling that would be the case. It's such a pity that there isn't a true equivalent of Max on iOS. I'm surprised Cycling '74 and Ableton haven't looked into this, as Max for Live is effectively a walled garden so iOS/iPadOS file system restrictions wouldn't get in the way. iPadOS coupled with a recent generation iPad certainly has the processing grunt to deliver a Max/Ableton mobile environment...

    It isn't terribly surprising that Cycling '74 or Ableton haven't come to iOS: making money on iOS is very, very hard -- particularly for large-scale products. They may well have looked into it and decided "yeah, no".

  • @espiegel123 said:

    @jonmoore said:
    I had a feeling that would be the case. It's such a pity that there isn't a true equivalent of Max on iOS. I'm surprised Cycling '74 and Ableton haven't looked into this, as Max for Live is effectively a walled garden so iOS/iPadOS file system restrictions wouldn't get in the way. iPadOS coupled with a recent generation iPad certainly has the processing grunt to deliver a Max/Ableton mobile environment...

    It isn't terribly surprising that Cycling '74 or Ableton haven't come to iOS: making money on iOS is very, very hard -- particularly for large-scale products. They may well have looked into it and decided "yeah, no".

    I think that's changing, with the likes of Steinberg and Avid successfully bringing apps like Dorico and Sibelius to the platform as extensions of desktop ownership rather than as profit centres in their own right. And the iOS versions of their apps aren't painfully cut-down versions either. iPadOS may be riddled with design patterns that make it difficult to deliver desktop-class user experiences (especially for audio), but it's still a very powerful OS coupled with best-in-class mobile processing capabilities. Adobe uses a similar business model with Photoshop and Premiere: these too are desktop-class user experiences, provided as an extended value proposition to Creative Cloud subscribers. This is the way to make money providing desktop-class applications on iOS.

  • @jonmoore: those developers decided it was worthwhile to try… some other developers look at the numbers and decide their money would be better spent elsewhere.

    I know a number of developers that gave a lot of thought to iOS and, after doing due diligence, decided not to get involved at this time or to reduce their involvement.

    Steinberg and Avid making those decisions doesn’t mean that the decision is an easy “yes” or would be the right decision for Ableton or Cycling74.

  • You can't do things like Reaktor or Max Gen when you can't compile and run code from an App, for security reasons. PD has already been ported several times, but only the processing part, without the UI, which would be much more effort.

  • @Max_Free said:
    You can't do things like Reaktor or Max Gen when you can't compile and run code from an App, for security reasons.

    But you can export code from gen~ and use it in your iOS project
    For example, Moebius Lab's audio engine was programmed in gen~

  • @yug said:

    @Max_Free said:
    You can't do things like Reaktor or Max Gen when you can't compile and run code from an App, for security reasons.

    But you can export code from gen~ and use it in your iOS project
    For example, Moebius Lab's audio engine was programmed in gen~

    You can use Faust to generate code that you can use in your iOS AUv3 too, but you have to take the step of going through Xcode on the Mac to make this work. As @Max_Free says, you can't directly compile on iOS and you can't even use a JIT compiler in memory. This is actually a huge limitation on iOS. It makes it unusable for entire categories of applications that the iPad would otherwise be ideal for.
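
    Whichever tool generates the code, the integration pattern ends up looking roughly the same: the exported C++ does the maths, and a thin wrapper owns its state and gets called from the AUv3 render block. A hedged sketch of that shape, with ExportedEngine standing in for whatever your export tool actually generates (the real gen~ and Faust interfaces differ, so treat every name below as a placeholder):

      // ExportedEngine is a placeholder for tool-generated DSP code; it is stubbed
      // out as a pass-through with gain so the sketch compiles on its own.
      #include <cstdio>
      #include <vector>

      class ExportedEngine {
      public:
          void init(double /*sampleRate*/, int /*maxFrames*/) {}
          void process(const float* const* in, float* const* out, int numFrames) {
              for (int n = 0; n < numFrames; ++n)
                  out[0][n] = 0.5f * in[0][n];   // trivial "DSP" for illustration
          }
      };

      class DSPWrapper {
      public:
          void prepare(double sampleRate, int maxFrames) { engine.init(sampleRate, maxFrames); }
          // Called from the AUv3's render block on the audio thread:
          // no allocation, no locks, just hand the buffers to the exported code.
          void render(const float* const* in, float* const* out, int numFrames) {
              engine.process(in, out, numFrames);
          }
      private:
          ExportedEngine engine;
      };

      int main() {                               // tiny offline smoke test
          DSPWrapper dsp;
          dsp.prepare(44100.0, 512);
          std::vector<float> inBuf(512, 1.0f), outBuf(512, 0.0f);
          const float* ins[1]  = { inBuf.data() };
          float*       outs[1] = { outBuf.data() };
          dsp.render(ins, outs, 512);
          std::printf("out[0] = %f\n", outBuf[0]);
      }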

  • @yug said:

    @Max_Free said:
    You can't do things like Reaktor or Max Gen when you can't compile and run code from an App, for security reasons.

    But you can export code from gen~ and use it in your iOS project
    For example, Moebius Lab's audio engine was programmed in gen~

    I've been thinking a lot about the Amazing Noises/apeSoft approach to iOS development, as I knew many of their products via Max and Max for Live before their iOS versions were released. The interesting thing now is that Max's gen~ is still used a lot in the dev phase before the iOS-only versions of their apps are released.

    As mentioned before, I'm not really interested in publishing iOS/iPadOS apps, but who knows: if I create something in Max that I think would be useful on iOS, it may become an option (if I felt my Max patch was strong enough and provided something unique).

  • @Max_Free said:
    You can't do things like Reaktor or Max Gen when you can't compile and run code from an App, for security reasons. PD has already been ported several times, but only the processing part, without the UI, which would be much more effort.

    I steer clear of PD (Pure Data) even though it's effectively a free, open-source variant of Cycling '74's Max. The thing I'm happy to pay for with Max is the incredible documentation and example help files. I'm not a programmer/developer, so my explorations of Max are heavily reliant on the documentation. I attempt to keep up with new developments in Miller Puckette's open-source project, but only at a very high level.

    My digital audio programming skill level requires the hand holding of Cycling '74's Max.

  • But Swift Playground compiles properly on iOS and should work for DSP experiments, correct?

    I remember someone programmed a sampler in Swift and released it open source. One could start from this.

  • @Max_Free said:
    But Swift Playground compiles properly on iOS and should work for DSP experiments, correct?

    I don’t know enough about it, and was hoping things had moved on enough that simple experiments might work. But the consensus seems to be no.

  • Found it:

    https://github.com/AudioKit/ROMPlayer

    I never tried it out; dunno if you need to use Xcode and to acquire a developer license for this, or whether it will just run on an iPad.

  • Swift for audio work has some definite issues. The foundational problem is that you can't guarantee the amount of time it takes to run any particular piece of code. There are ways to try and work around the issues, but nothing is going to make it really viable to do realtime work directly in Swift -- right now. There is the possibility of some changes in Swift making it possible and there is work being done in this area.

    I have a project on GitHub that is basically nonsense, but it does explore the idea of bridging out from Swift to do DSP; I had to drop into C to make doing anything possible. I don't think this is a possibility with Playgrounds at this point, because Playgrounds is mainly aimed at exposing SwiftUI to people.
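
    The bridging pattern being described, reduced to a sketch: the real-time work lives behind a plain C entry point (callable from Swift via a bridging header), and the Swift side only sets parameters and builds the UI. All names below are hypothetical, not taken from that GitHub project:

      // Hypothetical C-linkage surface that a Swift AUv3 would call from its render
      // block. Nothing here touches the Swift or Objective-C runtime, so it is safe
      // to run on the audio thread.
      #include <cstdint>

      extern "C" {

      typedef struct {
          float gain;        // parameter written from Swift, read on the audio thread
          float phase;       // oscillator state owned entirely by the C side
          float phaseInc;
      } DSPState;

      void dsp_init(DSPState* s, double sampleRate, double freqHz) {
          s->gain     = 0.5f;
          s->phase    = 0.0f;
          s->phaseInc = (float)(freqHz / sampleRate);
      }

      // Real-time safe: no allocation, no locks, no bridging back into Swift.
      void dsp_render(DSPState* s, float* out, int32_t numFrames) {
          for (int32_t n = 0; n < numFrames; ++n) {
              out[n] = s->gain * (2.0f * s->phase - 1.0f);   // naive sawtooth
              s->phase += s->phaseInc;
              if (s->phase >= 1.0f) s->phase -= 1.0f;
          }
      }

      } // extern "C"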

  • I haven't looked at AudioKit in a couple of years, but my recollection of the system was that you could put things together at the level of Swift but that the underlying audio (DSP) code was done in C. I also have no idea if you could use the underlying AudioKit library in Swift Playgrounds. Maybe, but I doubt it.

  • From looking at the code and their comment about the iOS version of the playground being less capable, I think you can't actually do anything at the C level in the iOS version, including linking to the underlying libraries. It does look like all of the AVFoundation and vDSP layers are there, and that would give you some powerful tools to play with. But to actually use these things from Swift in an app would be risky: there are issues with potential uncontrolled memory allocation when accessing the low-level pointer-type structures from Swift, which you just can't do in an RT audio thread. That's why, in my little example program on doing an AUv3 from Swift, I did all of those parts from a C library. The vDSP libraries appear to all be safe to use, but getting the data in and out of them is problematic.

    This does look like a good way to learn some DSP concepts on the iPad and get some experimentation going.
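
    To give a flavour of what the vDSP layer offers, a small example using two Accelerate calls (Apple platforms only, written in C++ here); the same functions are what you would reach for from Swift, which is exactly where the pointer-handling caveats above come in:

      // Requires the Accelerate framework (build with -framework Accelerate).
      // Applies a gain to a block with vDSP_vsmul, then measures the result
      // with vDSP_rmsqv.
      #include <Accelerate/Accelerate.h>
      #include <cstdio>
      #include <vector>

      int main() {
          std::vector<float> input(256, 1.0f), output(256, 0.0f);
          const float gain = 0.25f;

          // output = input * gain, vectorised in a single call
          vDSP_vsmul(input.data(), 1, &gain, output.data(), 1, vDSP_Length(input.size()));

          float rms = 0.0f;
          vDSP_rmsqv(output.data(), 1, &rms, vDSP_Length(output.size()));
          std::printf("rms after gain: %f\n", rms);   // expect 0.25 for this constant input
      }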

  • Now no one can easily compete with iOS.
