Audiobus: Use your music apps together.

Full Moon over Central Park

Comments

  • I love the beautiful simplicity of this. The liquid Fender Rhodes and the acoustic bass. The sax is so restrained and comes in at just the right times. The subtle key changes starting at 4:29 are really nice.

  • Thank you. Would you believe I didn’t write or perform this, but I wrote the code that did?

  • @TheOriginalPaulB said:
    Thank you. Would you believe I didn’t write or perform this, but I wrote the code that did?

    That’s cool. I was a software developer for many years and I just recently started making music. I would be most interested in the process.

  • Essentially it’s a set of 8 software LFOs that provide values that can be combined in different ways to control the pitch, duration and velocity of notes on up to 4 MIDI outputs, plus the duration of rests and longer gaps in each stream of notes. The LFOs can be configured with waveshape, frequency, amplitude and bias and there’s separate control over tempo and base pulse division, with the whole lot passing through a variable scale filter and key transposer before being output to the MIDI channels.
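    A minimal sketch of the scheme described above, assuming a simplified reading of it: LFOs sampled per step drive pitch, velocity and duration, with the result snapped through a scale filter and key transposer. All class names and parameters here are illustrative guesses, not the poster's actual code, and it emits plain tuples rather than real MIDI.

```python
import math

class LFO:
    """One software LFO: configurable waveshape, frequency, amplitude, bias."""
    def __init__(self, shape="sine", freq=0.25, amp=1.0, bias=0.0):
        self.shape, self.freq, self.amp, self.bias = shape, freq, amp, bias

    def value(self, t):
        phase = (t * self.freq) % 1.0
        if self.shape == "sine":
            raw = math.sin(2 * math.pi * phase)
        elif self.shape == "saw":
            raw = 2 * phase - 1
        elif self.shape == "square":
            raw = 1.0 if phase < 0.5 else -1.0
        else:  # triangle
            raw = 4 * abs(phase - 0.5) - 1
        return self.bias + self.amp * raw

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # the "variable scale filter", fixed here

def scale_filter(note, scale=C_MAJOR, key=0):
    """Snap a raw MIDI note down to the nearest scale tone, then transpose."""
    octave, pc = divmod(note, 12)
    snapped = max(p for p in scale if p <= pc)
    return octave * 12 + snapped + key

def generate(lfos, steps=8, base_note=60):
    """Sample three LFOs per step for (pitch, velocity, duration-in-beats)."""
    notes = []
    for step in range(steps):
        t = step * 0.5  # base pulse division: one sample every half beat
        pitch = scale_filter(base_note + round(12 * lfos[0].value(t)))
        vel = max(1, min(127, int(64 + 48 * lfos[1].value(t))))
        dur = 0.25 * (1 + round(3 * abs(lfos[2].value(t))))
        notes.append((pitch, vel, dur))
    return notes

lfos = [LFO("sine", 0.13), LFO("triangle", 0.21, amp=0.8), LFO("saw", 0.07)]
for note in generate(lfos):
    print(note)
```

    The real system presumably runs more LFOs, combines them, and streams to multiple MIDI channels; the point here is just the sample-combine-filter pipeline.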

  • Very nice. Reminded me a bit of Ooda/Riffer-generated music in places, but there seems to be more to this piece, especially when the sax comes in. I enjoyed it.

  • I think it betrays its origins, largely because there are no stylistic clichés in the sax playing…

  • I would be chuffed if I’d written software to produce this; it’s an enjoyable meander. The key change is sweetly dropped. Wonder what it would sound like if the MIDI was sent to, say, Borsta and Skaka? 🤔

  • I have neither.

    I’m now thinking of ways in which to guide the meandering in directions that follow human musical expectations.

    Things to address:
    There is no thematic material.
    There is no harmonic progression or cadential resolution.

    If I solve those issues, then the current code can be adjusted to drive variations in the themes and harmonic progression.
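    One hypothetical way to tackle the harmonic-progression point, sketched under my own assumptions rather than the poster's design: constrain the generated pitches to one chord per bar and force the phrase to land on the tonic, giving a simple cadential resolution that an LFO-driven walk could then decorate.

```python
import random

# An I-vi-IV-V-I progression in C, as pitch classes per bar (illustrative).
PROGRESSION = [
    [0, 4, 7],   # I  (C E G)
    [9, 0, 4],   # vi (A C E)
    [5, 9, 0],   # IV (F A C)
    [7, 11, 2],  # V  (G B D)
    [0, 4, 7],   # I  resolves the phrase
]

def meander(progression, notes_per_bar=4, seed=1):
    """Random walk that only steps to chord tones of the current bar."""
    rng = random.Random(seed)
    line = []
    prev = 60  # start from middle C
    for chord in progression:
        for _ in range(notes_per_bar):
            # Chord tones in nearby octaves, excluding the note just played;
            # prefer the closest one, breaking ties randomly.
            candidates = [pc + 12 * o for pc in chord for o in (4, 5, 6)
                          if pc + 12 * o != prev]
            prev = min(candidates, key=lambda n: (abs(n - prev), rng.random()))
            line.append(prev)
    line[-1] = 60  # cadence: end on the tonic
    return line

print(meander(PROGRESSION))
```

    The thematic-material point could build on the same idea, for example by generating one short motif and reusing it transposed to each bar's chord, but that is a further assumption.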

  • Sort of variable thematic recursion based around variable scale positions?

    Music (imo) is an expression of feeling. The intention of an improvisational player also has its own ADSR envelope, each expressed ‘chunk’ of a piece being its own wavelet of communication, much of this being as tonal and dynamic as it is harmonic, thematic and rhythmic, and the overall piece somehow expressing the sum of all these ‘wavelet chunks’.

    And of course, the less you notice the player’s mechanism, the more effective the conveyance of feeling.

    Interested to hear what you come out with though, very difficult to reduce musical experience to algorithmic data.

  • Nice bit of coding… interesting to see where you go with this project 👍

  • Nice coding for a nice output.

  • Very interesting. I hope you continue and share its evolution.
