Audiobus: Use your music apps together.


Is Swift enough for developing a MIDI-based app?

Comments

  • Only thing I’ll throw in is, don’t let the technical stuff keep you from trying out an idea! Just make sure to keep a spirit of learning and willingness to improve.

    ZOA includes a mix of Swift, Objective-C, and C++, with all the AUv3 MIDI happening in C++ land. But when I got into development, I didn’t know any of this stuff. I’ve learned a lot along the way from great blog posts and info from @MichaelTyson and @j_liljedahl and others. I am always trying to improve my code and be a better iOS citizen when I learn new things.

    I’d recommend starting with this blog post to get a basic MIDI effect up and running. rockhoppertech.com/blog/auv3-midi/

  • @j_liljedahl said:
    Yes, CoreMIDI threads are "high priority" but not the audio thread. So blocking there might not have the same catastrophic consequences as in the audio thread, but it all adds up (between apps) and can lead to late events / bad timing.

    Not on iOS, but early versions (1.x) of Akai MPC Essentials show exactly this behavior with plain MIDI tracks routed to external gear, even with fairly low screen activity.
    I didn't set up a measuring scenario, but running the app with no windows (controlling from the MPC Studio) seemed to solve this.

  • @j_liljedahl said:
    Any code running on realtime thread must be C or C++, and be realtime safe. This includes MIDI, especially AUv3 MIDI.

    Or Rust. Though I think I'd wait a few years before trying to use it on iOS. Let someone else iron out all the problems.

    That said, you probably could get away with Swift. I know a guy who built a livecoding system, including DSP, in Java. It shouldn't work, but it does. So if you're messing around - try it and see. At the very least you'll learn something which will help when you have to make the transition to something else.

  • The way Tidal works is it sends signals (via Open Sound Control, which is a network protocol a bit like HTTP) to SuperCollider (a realtime audio language). It doesn't have to be realtime because there's a latency of about 100 milliseconds between Haskell sending the message, and the generation of audio. Which would suck if you were playing a keyboard, but you don't notice when coding.

    So SuperCollider handles all the realtime stuff, and Haskell doesn't really have to worry about being so efficient.

    If I was doing a livecoding thing on iOS I would probably do it this way. I'd have a real-time library written in C/C++/whatever which takes messages with a timestamp into an ordered queue. Then every buffer cycle this library would process any MIDI messages whose timestamp meant it was time for them to go out. On the non-RT side (let's say Swift for the sake of argument) my code would simply fill that queue up with messages, and I'd have a latency of around 100 milliseconds.

    With this approach the worst that would happen is that if Swift was slow some MIDI messages would be a little delayed. And it would mean that the bit that has to be realtime would be doing the minimum amount of work during that realtime cycle.

  • @Michael said:
    Swift is not ready for realtime yet though; it’s extraordinarily difficult to write anything that doesn’t violate one or more of the realtime coding principles. Real-time parts should really be written in C or C++. Unless you’re developing just a toy app and it doesn’t really matter if it glitches 😄

    Isn’t AudioKit based on Swift? Yikes.

  • @gusgranite said:

    @Michael said:
    Swift is not ready for realtime yet though; it’s extraordinarily difficult to write anything that doesn’t violate one or more of the realtime coding principles. Real-time parts should really be written in C or C++. Unless you’re developing just a toy app and it doesn’t really matter if it glitches 😄

    Isn’t AudioKit based on Swift? Yikes.

    Actually, all of the real-time code is in C++. Only the interface is in Swift.

  • @Michael said:

    @gusgranite said:

    @Michael said:
    Swift is not ready for realtime yet though; it’s extraordinarily difficult to write anything that doesn’t violate one or more of the realtime coding principles. Real-time parts should really be written in C or C++. Unless you’re developing just a toy app and it doesn’t really matter if it glitches 😄

    Isn’t AudioKit based on Swift? Yikes.

    Actually, all of the real-time code is in C++. Only the interface is in Swift.

    Good to know!

  • @moodscaper said:
    In my humble opinion / limited experience you can do a reasonable amount with Core MIDI and Swift. FWIW, all the MIDI stuff in my app touchscaper (including the sequencer) is in Swift.

    Sounds like what you want to do will entail lots of tinkering with MIDI packets. Gene De Lisa has done lots of investigation work around using Swift and Core MIDI, as there is, to the best of my knowledge, zero coverage of it in any Apple tech resource.

    If this looks a bit scary, then you're right, it is, a bit. You kinda get used to it eventually :smile:

    rockhoppertech.com/blog/core-midi-midipacket-midipacketlist-and-builders-2/

    Thank you, Maestro Rob.

  • @CapnWillie said:
    I’ve favorited this thread and hope it continues to grow.

    So far I gather that beginner devs who wish to be strong iOS coding Jedi one day should begin with learning:

    1. C, C++ or both
    2. Swift
    3. Create GitHub account, search and dive through working repos from strong Jedi like @Michael @j_liljedahl and others.
    4. Hop in and get messy asap.

    If the future iOS Jedi is wanting to do audio related dev, yes. Although, kinda maybe not. I just realized when reading through this thread again that my test AUv3 host that I've written to run on the simulators is 100% Swift. So, depending on what you want to do with audio, maybe you don't need C and C++.

    If you want to be a developer at all learn C. It's good for your mind and soul. Also, the "Core *" of macOS and iOS is written in C.

    C++ is not good for your mind and soul, but yes you should know it for doing audio dev. But, you don't actually have to.

    You need to know a bit of Objective-C/Objective-C++ to write AUv3 at this point. (Technically, you could skip this and bridge Swift to C directly but then you need to know Swift really well and where you might get caught by Swift not being RT safe.)

    You don't actually need to learn Swift at all today to do audio or MIDI apps. You can still write an entire AUv3 in Objective-C, UI and all --- could be that at some time Apple deprecates UIKit and goes all SwiftUI. I'll walk away from iOS dev in less than an instant if they do.

    If you use something like JUCE, you can stay in C++ because it'll handle the parts you need to bridge to the Objective-C/Swift interface.

    Since we are talking about Jedi, you should learn Haskell and Eiffel too. You'll probably never use them for anything, but you will know why.

    Yes, definitely, for your points 3 and 4. One thing to keep in mind when looking at example AUv3 projects: it's hard to follow what does what, where, and when. There's no simple "main" function to look at to see where things go; even the event loop is kinda hidden from you. When you are getting messy, draw yourself some diagrams and figure out how the communication happens host-to-AU and AU UI to the AUAudioUnit.

  • edited May 2022

    @NeonSilicon said:
    You don't actually need to learn Swift at all today to do audio or MIDI apps. You can still write an entire AUv3 in Objective-C, UI and all --- could be that at some time Apple deprecates UIKit and goes all SwiftUI. I'll walk away from iOS dev in less than an instant if they do.

    you don’t think SwiftUI will be advanced enough by that point?

  • @vasilymilovidov said:

    @NeonSilicon said:
    You don't actually need to learn Swift at all today to do audio or MIDI apps. You can still write an entire AUv3 in Objective-C, UI and all --- could be that at some time Apple deprecates UIKit and goes all SwiftUI. I'll walk away from iOS dev in less than an instant if they do.

    you don’t think SwiftUI will be advanced enough at that point?

    I don't think they will ever actually deprecate UIKit. SwiftUI is only one architectural way of organizing an application. It has strengths and weaknesses. UIKit is somewhat similar, but it is more open to various different ways of architecting a program. Limiting all application dev on iOS to one architectural pattern (Model-View-ViewModel) would be a huge mistake, even if it is fairly popular at this point. It's never going to be the case that one application architecture is going to cover all possible application types in a reasonable fashion.

  • Haha wow, good zombie thread awakening!

  • It seems that Rust is as fast as C/C++.
    Does anyone know if it can be used for real-time audio applications and AUv3 development?

  • @Jeezs said:
    It seems that Rust is as fast as C/C++.
    Does anyone know if it can be used for real-time audio applications and AUv3 development?

    Rust could be used for audio and AUv3 dev. But, it'll be a pain in some ways. You'll need to bridge Rust to the Swift/Objective-C portion of the AU. This would be possible using the C interface that Rust provides. You will have to write the glue code to do it.

    It's important to note that the issue with Swift and real-time dev isn't the speed of Swift. It's basically that the libraries and runtime environment for the Swift executable are very difficult to use in a way that is bounded in time. Swift in general is more than fast enough for doing MIDI dev in an AUv3 context. It's definitely easier in Rust to guarantee the time constraints needed for RT, but it's still possible to write Rust code that'll break in an AUv3 context.

    The biggest issue is going to be setting up a dev environment to program in. Xcode doesn't support Rust. You'll need to put together an IDE and build system that can handle everything. The CLI tools for Xcode will make this doable, but it'll be a pain.

  • @NeonSilicon said:

    @Jeezs said:
    It seems that Rust is as fast as C/C++.
    Does anyone know if it can be used for real-time audio applications and AUv3 development?

    Rust could be used for audio and AUv3 dev. But, it'll be a pain in some ways. You'll need to bridge Rust to the Swift/Objective-C portion of the AU. This would be possible using the C interface that Rust provides. You will have to write the glue code to do it.

    It's important to note that the issue with Swift and real-time dev isn't the speed of Swift. It's basically that the libraries and runtime environment for the Swift executable are very difficult to use in a way that is bounded in time. Swift in general is more than fast enough for doing MIDI dev in an AUv3 context. It's definitely easier in Rust to guarantee the time constraints needed for RT, but it's still possible to write Rust code that'll break in an AUv3 context.

    The biggest issue is going to be setting up a dev environment to program in. Xcode doesn't support Rust. You'll need to put together an IDE and build system that can handle everything. The CLI tools for Xcode will make this doable, but it'll be a pain.

    Thanks.
    I get the point now.

  • Btw UIKit is now deprecated

  • @wahnfrieden said:
    Btw UIKit is now deprecated

    Where are you hearing this? Seems like something they would have stated at WWDC. UIKit isn't marked as deprecated in the docs. A whole bunch of stuff was just added to UIKit for iPadOS support of "Desktop-class features."

  • edited July 2022

    [embedded WWDC 2022 video segment]

    Revisit this portion: they clearly state that SwiftUI is their platform vision, and UIKit is lumped alongside Interface Builder as something that carried us to where we are today but is no longer part of their current platform vision. The explicit deprecated tag in their docs won't come until much later. They aren't ending development of UIKit overnight; there is still leftover momentum. But they are, in effect, announcing a deprecation.

    I'd like to see your enthusiasm for UIKit's longevity match your enthusiasm for Interface Builder, which is very obviously a dead end. UIKit proponents elsewhere have gotten the signal with this WWDC's explicit announcement. People hanging on sound like the lingering ObjC champions.

    UIKit and AppKit are in transitional phases. They still provide the backing for most of SwiftUI. That doesn't take away from the fact that Apple has ended their role in its platform vision, or in other words, "deprecated" ("disapproved") them.

  • So, it's not deprecated then. All of the new things they just added to UIKit and AppKit aren't wasted effort:

    https://developer.apple.com/documentation/uikit
    https://developer.apple.com/documentation/appkit

    I'll start worrying about UIKit going away when Apple migrates Logic and FinalCut to SwiftUI.

  • edited July 2022

    Something can be deprecated and still improved until it's able to be sunset. Agreed on Apple's own adoption as an early signal; I don't expect much in the next couple of years for their existing software properties which aren't receiving a makeover for other product purposes.

  • @wahnfrieden said:
    Something can be deprecated and still improved until it's able to be sunset. Agree on Apple's own adoption as early signal, I don't expect much until next couple years for their existing software properties

    Where does your information come from that it has been deprecated?

  • edited July 2022

    @espiegel123 said:

    @wahnfrieden said:
    Something can be deprecated and still improved until it's able to be sunset. Agree on Apple's own adoption as early signal, I don't expect much until next couple years for their existing software properties

    Where does your information come from that it has been deprecated?

    WWDC 2022, the segment I shared above. They haven't marked these technologies as deprecated in the docs, but they have announced their explicit disapproval of the tech in their platform vision as previous generation, no longer their focus. Apologies for loose usage of language, but I think their statement is loud and clear. UIKit will look like ObjC does now; think back to Swift's first year or two, when people were still skeptical of Apple committing to it because it was still early.

  • edited July 2022

    @wahnfrieden said:
    Something can be deprecated and still improved until it's able to be sunset. Agree on Apple's own adoption as early signal, I don't expect much in next couple years for their existing software properties which aren't receiving a makeover for other product purposes

    That isn't deprecated. Deprecation is a formal process. Apple has not deprecated UIKit or AppKit.

    As I pointed out, and as you can see in the links above, Apple is developing and improving both AppKit and UIKit.

  • edited July 2022

    Fine

    ObjC is not deprecated by your definition either, but you don't need to wait for Apple to take that formal step to understand the point.

    I'll add that most new strategic platform feature launches are now SwiftUI-exclusive. We aren't seeing UIKit/AppKit-exclusive launches of substance anymore, and are seeing Apple confidently requiring SwiftUI adoption in platform innovations. The UIKit/AppKit improvements in my view have been largely where SwiftUI parity is needed

    In other words: velocity remains, acceleration negative

    Btw I'll put my far-out guess here that the next strongest signal will be exclusive SwiftUI operability for realityOS (in other words, UIKit/AppKit left behind on new platforms)

  • Go back and check the links I posted above. Check for the section on Desktop-class iPad apps. That's all UIKit.

    https://developer.apple.com/documentation/uikit/app_and_environment/building_a_desktop-class_ipad_app

    This would be the most important new direction for iPadOS. All of the new navigation and "multitasking" features for iPad are UIKit.

    The only thing I know of that is a SwiftUI only newish addition is WidgetKit. That kinda makes some sense as a choice because widgets fit the architectural scale and limitations of SwiftUI fairly well.

  • @wahnfrieden said:
    Fine

    ObjC is not deprecated by your definition either, but you don't need to wait for Apple to take that formal step to understand the point.

    I'll add that most new strategic platform feature launches are now SwiftUI-exclusive. We aren't seeing UIKit/AppKit-exclusive launches of substance anymore, and are seeing Apple confidently requiring SwiftUI adoption in platform innovations. The UIKit/AppKit improvements in my view have been largely where SwiftUI parity is needed

    In other words: velocity remains, acceleration negative

    Btw I'll put my far-out guess here that the next strongest signal will be exclusive SwiftUI operability for realityOS (in other words, UIKit/AppKit left behind on new platforms)

    I see your point and agree.

  • The emphasis on SwiftUI at WWDC was kind of blunt. I wouldn’t ignore it.

  • @realdawei said:
    The emphasis on SwiftUI at WWDC was kind of blunt. I wouldn’t ignore it.

    Write a Metal-based spectrum visualizer in SwiftUI.

    The emphasis on SwiftUI at WWDC in the non-technical talks is basically Apple trying to convince devs to use SwiftUI to write iPhone apps instead of React Native or Dart/Flutter. I'm not going to use either of those two to write a Metal-based app either. At least at this point, SwiftUI is not appropriate for entire classes of applications. You choose the tools to do the job. You can write what are basically forms-based, web-like apps in SwiftUI. You can't write a 3D game in it.

  • @NeonSilicon said:
    https://developer.apple.com/documentation/uikit/app_and_environment/building_a_desktop-class_ipad_app
    This would be the most important new direction for iPadOS. All of the new navigation and "multitasking" features for iPad are UIKit.

    This is what I referred to as essentially UIKit reaching SwiftUI parity on new platform behaviors/functions

  • @wahnfrieden said:

    @NeonSilicon said:
    https://developer.apple.com/documentation/uikit/app_and_environment/building_a_desktop-class_ipad_app
    This would be the most important new direction for iPadOS. All of the new navigation and "multitasking" features for iPad are UIKit.

    This is what I referred to as essentially UIKit reaching SwiftUI parity on new platform behaviors/functions

    I don't follow what you mean. These are things that are new to iPadOS that are only possible using UIKit. This is UIKit well ahead of SwiftUI.
