
AudioKit Keynote Address at Audio Developer Conference on the future of iOS Audio Dev


The AudioKit keynote from the 2018 Audio Developer Conference in London a couple of days ago is now online. It's an inspiring talk on audio development for iOS and beyond:

https://audiokitpro.com/adc-2018-keynote/

Comments

  • Definitely gonna watch this.
    Just been caught up in Black Friday Madness the past couple of days.
    :)

  • Hate it when they interject political commentary: both him and the announcer. “Used to be a democracy”... yeah, okay. Enough already.
    #notHerePlease

  • Wow, just saw the whole thing. Who knows what the world at large will come up with now? I never even bothered to check out D1, because who needs another D50 emulation? Boy, was I wrong! What an amazing synth! Not convinced with the microtonal thing, though. I used to live in the same building as one of New York’s leading microtonal bands, and that was 30 years ago. I don’t really think it’s gonna catch on. Brilliant presentation otherwise. My favorite part: when he put Reason up on the screen.

  • Thought-provoking talk about a truly inspiring project! Congrats to the whole AudioKit team, @analog_matt! Keep up the amazing work!

  • such a cool project

  • Really nice commentary on the keynote by @Enkerli:

    "There really is something happening in the AudioKit space. Aure’s talk was inspiring at several level and might be a bit of a pivot, in the audio development world.
    Dunno how formal the relationship with SOUL will be, but that’s quite intriguing in and of itself. Makes me want to watch the announcement.

    There’s the issue of platform support. It sounds like people in the room may have quickly dismissed AudioKit because it’s so iOS-centric (and, in fact, iPad-centric). But Swift can work on other OSes. One of JUCE’s big strengths is in cross-platform support. Maybe there’ll be a way to leverage AudioKit frameworks in JUCE, on diverse OSes?

    The part about microtuning was quite interesting to me. It may not have broad appeal, which might be the reason the Synth One tuning panel isn’t in D1. But it’s the kind of code sharing which can really deepen the relationship among music apps. What Aure announced sounds like the beginning of a kind of Ableton Link for tuning! That could be just awesome.

    What makes this especially intriguing is that it specifically connects through MPE support. Now, that is an interesting prospect, especially given the whole section about Linn’s advocacy for expressive electronic instruments.

    So far, there hasn’t been much MPE support in AudioKit apps. mTonal is an exception. And getting a wider range of MPE controllers can really open things up. While KB-1 is probably not built in AudioKit, it’s another piece of that whole puzzle, especially because it’s available as an AUv3.

    Which begs the question: will the addition of AUv3 to AudioKit synths spread quickly? Devs complain about how clunky AUv3 can be. Yet, once people get good code in AudioKit Synth One, could we get a whole lot of new apps adding AUv3 support? Including AU MIDI?

    Sounds to me like the puzzle pieces are falling into place. MIDI-CI is a big missing chunk, and we don’t know how it’ll pan out apart from reading the specs. As we integrate more and more diverse apps and devices, there’s an opportunity for something pretty big in the development of musicking apps.
    With or without fast-food placemats."
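
    A side note on that “Ableton Link for tuning” idea: it may be less exotic than it sounds. At its simplest, a shared tuning is just one frequency per MIDI note, as in this minimal Swift sketch (hypothetical names, not anything actually announced in the talk):

    ```swift
    import Foundation

    // Back-of-the-envelope sketch: a tuning as one frequency per MIDI note.
    struct TuningTable {
        var frequencies: [Double]   // 128 entries, in Hz

        // Standard 12-TET for reference: A4 (MIDI note 69) = 440 Hz.
        static let twelveTET = TuningTable(
            frequencies: (0...127).map { 440.0 * pow(2.0, (Double($0) - 69.0) / 12.0) }
        )

        func frequency(forNote note: Int) -> Double {
            return frequencies[min(max(note, 0), 127)]
        }
    }

    // Two apps “agreeing on tuning” could mean no more than exchanging these
    // 128 numbers, the way Ableton Link exchanges tempo and phase.
    ```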

  • @analog_matt said:
    Which begs the question: will the addition of AUv3 to AudioKit synths spread quickly?

    How far away is AUv3 support in AudioKit? Just curious. Once there is, is it easy to add to FM Player, D1 and Synth One? Or is it just possible with a lot of effort?

  • @McDtracy said:

    How far away is AUv3 support in AudioKit? Just curious. Once there is, is it easy to add to FM Player, D1 and Synth One?

    There are still some bugs to work out. However, hopefully it won't be a huge undertaking to add to current & future apps once all the kinks are worked out. Roughly, the shape of the wrapper is like the sketch below.
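
    For the curious, here is a minimal sketch of the AUAudioUnit shell an AUv3 wrapper needs; it's hypothetical and simplified, not AudioKit’s actual implementation, and the real work is wiring the existing DSP into the render block:

    ```swift
    import AudioToolbox
    import AVFoundation

    // Minimal AUv3 instrument shell (bus setup and render block only).
    public class SynthAudioUnit: AUAudioUnit {
        private var outputBusArray: AUAudioUnitBusArray!

        public override init(componentDescription: AudioComponentDescription,
                             options: AudioComponentInstantiationOptions = []) throws {
            try super.init(componentDescription: componentDescription, options: options)
            // One stereo output bus; the host negotiates the final format.
            let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)!
            outputBusArray = AUAudioUnitBusArray(audioUnit: self,
                                                 busType: .output,
                                                 busses: [try AUAudioUnitBus(format: format)])
        }

        public override var outputBusses: AUAudioUnitBusArray {
            return outputBusArray
        }

        public override var internalRenderBlock: AUInternalRenderBlock {
            // An app’s existing DSP engine would render into outputData here.
            return { _, _, _, _, outputData, _, _ in
                let buffers = UnsafeMutableAudioBufferListPointer(outputData)
                for buffer in buffers {
                    memset(buffer.mData, 0, Int(buffer.mDataByteSize))   // silence stub
                }
                return noErr
            }
        }
    }
    ```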

  • As @TheDubbyLabby reminds me, some of the future is likely to lie in embedded devices. Since Swift already runs on ARM (and x86) and there’s experimental support on Raspberry Pi, there might be a future where the same AudioKit code could be shared between iOS and single-board computers. In other words, maybe we could eventually have hardware versions of Synth One, D1, etc. In that case, the focus might not be on the GUI, and the simulated knobs/sliders could be actual pots and sliders. (See the toy sketch below.)
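
    To make that concrete: the portable part is the DSP, not the GUI. A toy Swift voice like this (purely illustrative; much of AudioKit’s real DSP lives in C/C++ kernels) compiles unchanged on iOS and on Linux:

    ```swift
    import Foundation

    // A sine voice with no UIKit/AppKit dependency: the kind of code that
    // could, in principle, be shared between iOS and a single-board computer.
    struct SineVoice {
        var frequency: Double = 440
        var sampleRate: Double = 48_000
        private var phase: Double = 0

        mutating func nextSample() -> Float {
            let sample = Float(sin(phase))
            phase += 2 * .pi * frequency / sampleRate
            if phase > 2 * .pi { phase -= 2 * .pi }
            return sample
        }
    }

    #if os(iOS)
    // Feed samples to AVAudioEngine or an AUv3 render block.
    #elseif os(Linux)
    // Feed the same samples to ALSA (via a small C shim) on a Raspberry Pi.
    #endif
    ```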

    The MODEP setup is an interesting version of this. It’s an emulation of the MOD Duo pedal on a Raspberry Pi with a Pisound HAT (like a soundcard). The Pi is headless (it doesn’t need a screen): it runs a WiFi access point and a Web server so you can create virtual pedalboards from any Web browser, including on a phone or tablet. The plugins for those virtual pedals are LV2, which is a different system than VST, Audio Units, or AAX. But it’s pretty much the same idea. There are some quirks, but it works well. Used it on several occasions, including in demos at museums.

    There are hundreds of LV2 plugins, but the breadth isn’t really there and many of them are kind of subpar. There are some pretty cool ones, though nothing which would make people switch from VST/AU. Several do work pretty well as desktop versions on macOS or Windows, though. Haven’t noticed any which also exist on iOS.
    The dream would be to have the same plugins (and patches!) running on embedded devices, desktop platforms, and phones/tablets.

    There’s something really special about having a small hardware device which, conceivably, could run any softsynth or effect imaginable. It’s a bit hard to explain the difference from “working in the box”. Some of it is psychological, which is to say it works because of the way the human brain works. But it’s also about some very practical things, like a consistent experience, not having to look at a screen while playing, and offloading processing to diverse devices. In some cases, it could also mean more direct/continuous control (say, CV) than with MIDI CC.

    My own dream is to have a breath-controlled lowpass filter working through this kind of direct connection. Haven’t really had issues with “stepping” or latency with the MIDI CC version. Yet there’s something very subtle which goes on when you have direct control of a filter. Your brain may pick up on it and give you a sense of satisfaction which isn’t connected to your “conscious ear” being able to tell the difference.
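
    About that “stepping”: a 7-bit CC can only ever land on 128 discrete cutoff values across the whole sweep, while a CV-style signal is effectively continuous. A toy comparison (the exponential mapping is made up purely for illustration):

    ```swift
    import Foundation

    // 7-bit MIDI CC: the cutoff lands on one of only 128 possible values.
    func cutoffFromCC(_ cc: UInt8) -> Double {
        let normalized = Double(min(cc, 127)) / 127.0
        return 20.0 * pow(1_000.0, normalized)   // exponential sweep, ~20 Hz to ~20 kHz
    }

    // CV-style control: a continuous 0...1 signal, with no quantization steps.
    func cutoffFromCV(_ cv: Double) -> Double {
        return 20.0 * pow(1_000.0, min(max(cv, 0), 1))
    }

    // Adjacent CC steps are ~5.6% apart in frequency, close to a semitone:
    // audible in principle, even if rarely a problem in practice.
    ```
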
    There’s also something really important about direct feedback, including tactile. Coming from a saxophone background, the sensory experience can actually be about feeling the physical vibration through your whole air column.

    So… We might be a long way from SOUL-enabled devices pushing the processing so far along the chain. But it’s still fun to dream up the possibilities.
