Audiobus: Use your music apps together.

What is Audiobus? Audiobus is an award-winning music app for iPhone and iPad which lets you use your other music apps together. Chain effects on your favourite synth, run the output of apps or Audio Units into an app like GarageBand or Loopy, or select a different audio interface output for each app. Route MIDI between apps — drive a synth from a MIDI sequencer, or add an arpeggiator to your MIDI keyboard — or sync with your external MIDI gear. And control your entire setup from a MIDI controller.

Fugue Machine / PolyPhase (beta) live jam (w/ StepPolyArp, Rozeta, iGrand, Quanta, SpaceCraft, etc.)

Live performance using Fugue Machine and PolyPhase (note: this is the 1st beta of PolyPhase).

These two work very nicely together. Fugue Machine is a more precise "performance sequencer", but PolyPhase is a tremendous idea generator. This is with the 1st beta; I'm looking forward to trying the update and the eventual final release of this app.

Both Fugue Machine and PolyPhase are playing instruments directly as well as triggering StepPolyArp Unit, Rozeta instances and Mersenne's arpeggiator for additional variation (Finally updated past iOS 10.3.3 so I could play with the MIDI AUv3s!).

Sound sources are iGrand, iSEM, Quanta, SpaceCraft, Laplace and Mersenne.
Effects include DLYM, Reverb - FDN, Discord4 and RE-1.
Hosted in Audiobus and AUM.

This is a slow build... give it time....

Comments

  • Oh hell yeah!
    That was awesome. B)

    Really nice touch with FM. The live playing was really good.
    Sounded quite glorious with everything going. Then the way you strip it all back for the ending was really well done.

    How did you film this, with 3 camera setups? That part was awesome, too. Great job on the video.

    Five Stars!
    :)

  • Great video and music, man! I'd never thought to have Fugue trigger SPA or Rozeta. Great idea.
    Definitely gonna jam this a few times.

  • @CracklePot said:
    Oh hell yeah!
    That was awesome. B)

    Really nice touch with FM. The live playing was really good.
    Sounded quite glorious with everything going. Then the way you strip it all back for the ending was really well done.

    How did you film this, with 3 camera setups? That part was awesome, too. Great job on the video.

    Five Stars!
    :)

    No kidding.. I can't get one damn camera to sync to audio... Getting more than one to sync to audio AND get them to sync to each other... That's witchcraft I can't even begin to imagine...

  • Great work man! Really nice.
    Do you really trigger SPA with Fugue? And then out to what?!

  • Excellent stuff!

  • Yeah I enjoyed that. Good work 👍🏻

  • Very nice, and I love the information of how and with what. :)

  • Thanks everyone!

    @MonkeyDrummer said:

    @CracklePot said:
    How did you film this, with 3 camera setups? That part was awesome, too. Great job on the video.

    No kidding.. I can't get one damn camera to sync to audio... Getting more than one to sync to audio AND get them to sync to each other... That's witchcraft I can't even begin to imagine...

    Yes, it's 3 cameras, but nothing fancy:
Top-down is my iPhone 6s ("famously" used here; in that video, the angle shot was from this iPad).
    Right side shot was my old iPhone 5s.
    Rear shot was from my Sony A7. In other videos I use that for the close macro shots, but here I wanted more of the space and vibe rather than a tight shot.

    Syncing and multicam editing is really easy. There are several programs and plugins that will do the alignment for you automatically, but even manually you just need to slate the recordings (snap your fingers where all cameras can see) and then set that as the common start point for all recordings in your editor. Honestly the most time consuming thing is the physical setup of the cameras!
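The manual slating described above (snap your fingers, then line everything up on the snap) is essentially what the automatic alignment plugins do: they cross-correlate the cameras' audio tracks and find the offset where they match. A minimal sketch of that idea, assuming mono audio arrays at a common sample rate (the function name and synthetic click tracks are illustrative, not from any real product):

```python
import numpy as np

def slate_offset(ref: np.ndarray, other: np.ndarray, sample_rate: int) -> float:
    """Estimate how many seconds `other` lags `ref` by cross-correlating
    the two audio tracks; the correlation peak marks the best alignment
    (the shared transient, e.g. a finger snap, dominates the peak)."""
    corr = np.correlate(other, ref, mode="full")
    lag = np.argmax(corr) - (len(ref) - 1)
    return lag / sample_rate

# Tiny synthetic check: a click at 1.0 s vs the same click at 1.5 s.
sr = 1000
a = np.zeros(sr * 3); a[sr] = 1.0
b = np.zeros(sr * 3); b[int(sr * 1.5)] = 1.0
print(slate_offset(a, b, sr))  # 0.5
```

In an editor you would then shift each clip by its estimated offset so all recordings share a common start point, exactly as with a manual slate.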

  • @aplourde said:
    Thanks everyone!

    @MonkeyDrummer said:

    @CracklePot said:
    How did you film this, with 3 camera setups? That part was awesome, too. Great job on the video.

    No kidding.. I can't get one damn camera to sync to audio... Getting more than one to sync to audio AND get them to sync to each other... That's witchcraft I can't even begin to imagine...

    Yes, it's 3 cameras, but nothing fancy:
Top-down is my iPhone 6s ("famously" used here; in that video, the angle shot was from this iPad).
    Right side shot was my old iPhone 5s.
    Rear shot was from my Sony A7. In other videos I use that for the close macro shots, but here I wanted more of the space and vibe rather than a tight shot.

    Syncing and multicam editing is really easy. There are several programs and plugins that will do the alignment for you automatically, but even manually you just need to slate the recordings (snap your fingers where all cameras can see) and then set that as the common start point for all recordings in your editor. Honestly the most time consuming thing is the physical setup of the cameras!

    What iPad were you running all that goodness on? It handled it pretty flawlessly.


    So nice!
    That’s the best Video of generative music I’ve seen so far. Nicely done! 🎶👍

    I’m so excited to see what we can do with an iPad these days! 😊
    Amazing!

  • @Jumpercollins said:

    What iPad were you running all that goodness on? It handled it pretty flawlessly.

iPad Pro 9.7". It's pretty flawless; there are a couple of places where you can hear the MIDI chugging a bit when switching apps.

    I just upgraded to iOS 12 (from iOS 10.3.3). I realize now I should turn on "Reduce Motion" as the app switching animation is much more elaborate (and presumably resource heavy) now.

  • @chandroji said:
    So nice!
    That’s the best Video of generative music I’ve seen so far. Nicely done! 🎶👍

    I’m so excited to see what we can do with an iPad these days! 😊
    Amazing!

    Thank you!

  • @aplourde said:

    @Jumpercollins said:

    What iPad were you running all that goodness on? It handled it pretty flawlessly.

iPad Pro 9.7". It's pretty flawless; there are a couple of places where you can hear the MIDI chugging a bit when switching apps.

    I just upgraded to iOS 12 (from iOS 10.3.3). I realize now I should turn on "Reduce Motion" as the app switching animation is much more elaborate (and presumably resource heavy) now.

Same model as mine. Great stuff.

  • @chandroji said:
    So nice!
    That’s the best Video of generative music I’ve seen so far. Nicely done! 🎶👍

    I’m so excited to see what we can do with an iPad these days! 😊
    Amazing!

    You should check out the previous Fugue jam posted here by @aplourde, he is very good with that app!

  • @aplourde said:
    Thanks everyone!

    Yes, it's 3 cameras, but nothing fancy:
Top-down is my iPhone 6s ("famously" used here; in that video, the angle shot was from this iPad).
    Right side shot was my old iPhone 5s.
    Rear shot was from my Sony A7. In other videos I use that for the close macro shots, but here I wanted more of the space and vibe rather than a tight shot.

    Syncing and multicam editing is really easy. There are several programs and plugins that will do the alignment for you automatically, but even manually you just need to slate the recordings (snap your fingers where all cameras can see) and then set that as the common start point for all recordings in your editor. Honestly the most time consuming thing is the physical setup of the cameras!

    By sync I mean drift... I'll start on the exact frame, and a few minutes in the audio and video are no longer lined up... Apparently it has to do with variable frame rates, audio sampling frequency (48K seems to work better), sun spots, Russian hacking... Stuff like that...

  • @MonkeyDrummer said:

    @aplourde said:
    Thanks everyone!

    Yes, it's 3 cameras, but nothing fancy:
Top-down is my iPhone 6s ("famously" used here; in that video, the angle shot was from this iPad).
    Right side shot was my old iPhone 5s.
    Rear shot was from my Sony A7. In other videos I use that for the close macro shots, but here I wanted more of the space and vibe rather than a tight shot.

    Syncing and multicam editing is really easy. There are several programs and plugins that will do the alignment for you automatically, but even manually you just need to slate the recordings (snap your fingers where all cameras can see) and then set that as the common start point for all recordings in your editor. Honestly the most time consuming thing is the physical setup of the cameras!

    By sync I mean drift... I'll start on the exact frame, and a few minutes in the audio and video are no longer lined up... Apparently it has to do with variable frame rates, audio sampling frequency (48K seems to work better), sun spots, Russian hacking... Stuff like that...

    Ah. Well, variable frame rates would definitely contribute to that. You want to use constant frame rates for any type of editing.

    Sample rate shouldn't be an issue as long as you're using the correct rate or the software interpolates. Frame rates also won't be an issue unless you have a mismatch (29.97 vs 30 fps) and the software can't accommodate.

    The main thing is to not use variable frame rates, be cognizant of the rates you're shooting at, and match when editing. Also, ideally, don't edit with compressed formats. Use an intermediate codec for the video and WAV or AIFF for the audio, but this is more for editing performance than sync per se.

Honestly, with modern gear you shouldn't have noticeable drift until several dozen minutes pass (maybe 1/10 second after 45 minutes?).
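As a rough back-of-the-envelope check on those numbers, drift from mismatched clocks grows linearly with take length, so it can be expressed in parts per million. The ~37 ppm figure below is an assumed example chosen to match the "1/10 second after 45 minutes" estimate above, not a measured spec:

```python
def drift_seconds(duration_s: float, clock_error_ppm: float) -> float:
    """Sync drift accumulated when one device's clock runs fast or slow
    by `clock_error_ppm` parts per million relative to another's."""
    return duration_s * clock_error_ppm / 1e6

take = 45 * 60  # a 45-minute take, in seconds

# An assumed ~37 ppm clock error gives roughly 0.1 s of drift:
print(round(drift_seconds(take, 37), 2))  # 0.1

# By contrast, treating 29.97 fps footage as 30 fps is a 1000 ppm
# error, i.e. whole seconds of drift over the same take:
print(round(drift_seconds(take, 1000), 2))  # 2.7
```

This is why matching (and fixing) frame rates matters far more than the quality of the clocks themselves.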

  • A couple of people have been asking me how this was set up.

It’s pretty straightforward:
    Fugue Machine, iGrand and AUM hosted in Audiobus for state-saving.

    Everything else hosted in AUM.

    @reasOne said:
Great video and music, man! I'd never thought to have Fugue trigger SPA or Rozeta. Great idea.

    This is one of my favorite things. Fugue Machine is an incredible performance sequencer, but with 4 playheads playing the same set of notes it is limited. By using one playhead to control the transposition of another app you open up a world of options that will still be in key with what you’re doing in FM.

    I posted some tips for “playing” Fugue Machine here

    Still using the same techniques: Blue, Orange and Yellow playheads are playing iGrand. The Red playhead is directed to Rozeta Bassline which is set to transpose. That goes through Rozeta Scaler and on to iSEM.

BTW, when I say "directed to", I should clarify that I always try to have apps send to their own named output port and then have the receivers select those ports. I find there's usually less of a chance for something to go screwy; it's more intentional to have the receiver connect to another app than just throwing MIDI at a receiver.
    Also, by sending to the app's own output you can have multiple apps connected to that output. E.g. you can monitor that output while also having the receiver app connected. If you were directly sending to an app's input, you can't monitor that; you'd have to disconnect from your receiving app and reconnect to the MIDI monitor. This is very useful in complex routings, or when you’re doing transpositions so you can verify that you’re getting the result you were looking for.
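The fan-out benefit described above is easy to model: the sender publishes one named output port, and any number of receivers (a synth, a MIDI monitor) can attach to it without stealing the stream from each other. A toy sketch of the pattern (the class and port names are illustrative, not a real CoreMIDI or app API):

```python
class MidiOutPort:
    """Toy model of an app's named MIDI output port: receivers opt in
    by connecting to the port, and every message fans out to all of them."""
    def __init__(self, name: str):
        self.name = name
        self.receivers = []

    def connect(self, receiver):
        """Attach a receiver callback; any number may be connected."""
        self.receivers.append(receiver)

    def send(self, message):
        """Deliver one message to every connected receiver."""
        for receiver in self.receivers:
            receiver(message)

# The sender publishes its own port; receivers choose to listen to it.
port = MidiOutPort("Rozeta Bassline Out")
synth_notes, monitor_log = [], []
port.connect(synth_notes.append)  # the synth (e.g. an iSEM-style receiver)
port.connect(monitor_log.append)  # a MIDI monitor tapping the same port
port.send(("note_on", 60))        # both receivers get the note
```

Because the monitor and the synth are both just receivers on the same published port, you can verify a transposition is doing what you expect without disconnecting anything.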

    @marcuspresident said:
    Do you really trigger SPA with Fugue? And then out to what?!

    Actually, SPA was being triggered by PolyPhase (the 3rd channel), but the idea is basically the same as above. A slight difference is that Fugue Machine is transposing the sequences playing in Rozeta Bassline but PolyPhase is directly sending the notes to SPA to arpeggiate. From there the notes go to Laplace.
    Honestly, with just a single note and the simple rhythm, I probably could have just used the built-in arp in Laplace, but I just updated from iOS 10, so I wanted to try out all the MIDI AUs! I’m definitely planning to do something more elaborate with SPA…

    Track 1 in PolyPhase is directly playing Quanta.

    Tracks 2 and 4 in PolyPhase send notes to Mersenne. Initially it plays the notes directly, but as you can see, I turn on the arp in Mersenne and build up its complexity.
    Actually, I just realized that I only turn on track 4 at the end of the song. I was meaning to turn it on earlier to build up the mid section even more… oh well…

For the most part I’m not really using PolyPhase “right”: I have it set up as 4 fixed sequences. Only at the end, during the denouement, do I turn on the generative aspect of PolyPhase to get Quanta and Mersenne moving around more, as a contrast with the rest of the instruments that are breaking down and becoming more sparse.

    In AUM I have SpaceCraft playing a drone. The sample is the output of what Quanta and Mersenne are playing slightly pitch adjusted. I also have an instance of Rozeta LFO. One LFO is controlling the stereo balance of Mersenne to provide some motion; the other two LFOs are modulating the grain size and sample position in SpaceCraft.

    A bit of effects processing and that’s it!

  • It’s pretty straightforward:
    Fugue Machine, iGrand and AUM hosted in Audiobus for state-saving.

    Thanks for the tips! Great video.
    I only use AUM, so this may be a stupid question: can you save the state of IAA apps in Audiobus? Also the whole state of AUM? You don't have to save the AUM session separately? Thanks!

  • Many thanks for this great info!

  • Audiobus can save the state of those IAA apps (including AUM) that support state saving. Creating an AB3 project can be very handy for any project that involves IAA apps, even when you don't actually need AB3 to do any routing.

    @JoHe said:

    It’s pretty straightforward:
    Fugue Machine, iGrand and AUM hosted in Audiobus for state-saving.

    Thanks for the tips! Great video.
    I only use AUM, so this may be a stupid question: can you save the state of IAA apps in Audiobus? Also the whole state of AUM? You don't have to save the AUM session separately? Thanks!

  • @espiegel123 said:
    Audiobus can save the state of those IAA apps (including AUM) that support state saving. Creating an AB3 project can be very handy for any project that involves IAA apps, even when you don't actually need AB3 to do any routing.

    Very cool, thank you! Some of my favorite apps are IAA only

  • @JoHe said:

    @espiegel123 said:
    Audiobus can save the state of those IAA apps (including AUM) that support state saving. Creating an AB3 project can be very handy for any project that involves IAA apps, even when you don't actually need AB3 to do any routing.

    Very cool, thank you! Some of my favorite apps are IAA only

    Yes, most of my favorite apps aren’t AUs: Fugue Machine, Quantum, Xynthesizr, Patterning, Elastic Drums, Samplr, FieldScaper, Borderlands, iPulsaret, Shoom....

    Go to the Apps section of the Audiobus website and you can search for your favorite apps to see if they support state saving, remote triggers, etc.

  • Excellent breakdown and info here!! Thanks for taking the time to share these secrets! True gold!
    I'd never thought to have Rozeta stuff trigger things other than notes, because I'm pretty new to these kinds of sequencers; I've always just played keys and twisted knobs, so these are pretty rad ideas!!
    Much appreciated, and seeing how complex the setup of this song is just makes me dig it that much more.

  • How do you get Fugue Machine to route to the transpose in Bassline? I don't see a way to connect it in the AUM channel.

  • Oh man, I'm too bad at this, haha. As soon as I set up AUM how you have it and pressed play in Fugue, all the synths just started jamming... I'm in over my head, hahaha.

  • Ahh, you have to use the synths as AUs to set the parameters in AUM. I was using IAA.
