What kind of music interface are you missing?

edited October 2013 in General App Discussion

There is so much unexplored potential for how to play music expressively on an iPad or iPhone. Ten simultaneous touch inputs plus the three-axis gyro, accelerometer, proximity sensor, and ambient light sensor offer so much potential that very few music apps have made truly innovative use of them.

What would you like to see explored? Specific instruments you wish were emulated better? Ideas for using an iDevice more innovatively?

I think we have the ears of some interested and very capable devs here, so let's give them our best ideas.


Comments

  • I'll start.

    Why in the world can I play a race car or jet fighter game that responds instantly to gyro and accelerometer movement, but hardly any music apps make use of it?

    Is ThumbJam the only one that really does this well?
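
    For the devs reading: hooking the gyro into a synth parameter is only a few lines with CoreMotion. A rough Swift sketch, where the function name, update rate, and tilt-to-value mapping are just illustrative choices:

    ```swift
    import CoreMotion

    // Poll device attitude and map roll to a 0...1 "modulation" value,
    // the way a racing game maps tilt to steering.
    let motion = CMMotionManager()

    func startTiltModulation(onChange: @escaping (Double) -> Void) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 60.0   // 60 Hz is plenty for control data
        motion.startDeviceMotionUpdates(to: .main) { data, _ in
            guard let attitude = data?.attitude else { return }
            // Roll is in radians; clamp +/-45 degrees of tilt into 0...1.
            let normalized = min(max(attitude.roll / (.pi / 4), -1), 1)
            onChange((normalized + 1) / 2)
        }
    }
    ```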

  • An alternate question I'd like to ask: what are today's best iOS interfaces for playing:
    1) Piano (melodies and chords)
    2) Guitar (melodies and chords)
    3) Violin and other bowed instruments
    4) Flute and other wind instruments

    If we could create a suite of apps that combine the best interfaces with the best sample sets, and throw in ACP + AB + IAA + full-featured MIDI, we'd get virtual instruments that sound and feel awesome to play.

    And then we'd have the very interesting ability to cross-connect them via MIDI, so use a flute interface to play piano, a violin interface to play guitar, etc.

    Is it just me or would that be pretty effing cool? :D

  • Some of the TC-11 presets include gyro control. I'm still not getting much usable music from TC-11, but it's fun to play anyway.

    I would like to see more customization of playing controls. As an example, there are a lot of good apps for playing ThumbJam or SampleTank over MIDI, but I would like to be able to put any chords or notes anywhere I want on the screen, with my choice of button size and shape. I would like to be able to instantly switch between my different customized layouts on the fly within a semi-improvised performance.
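
    Something like this data model is what I have in mind; a rough Swift sketch, with every type and property name invented for illustration:

    ```swift
    import CoreGraphics

    // One pad = one freely placed, freely sized trigger for a chord or note.
    struct Pad {
        var notes: [UInt8]    // MIDI note numbers the pad triggers
        var frame: CGRect     // position and size, chosen by the user
        var isRound: Bool     // shape choice
    }

    struct Layout {
        var name: String
        var pads: [Pad]
    }

    // Switching layouts is just an index change, so it can happen
    // mid-performance without reloading anything.
    final class LayoutSwitcher {
        var layouts: [Layout] = []
        private(set) var current = 0

        func next() {
            guard !layouts.isEmpty else { return }
            current = (current + 1) % layouts.count
        }
        var activePads: [Pad] { layouts.isEmpty ? [] : layouts[current].pads }
    }
    ```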

  • @rhism, I love how guitarism works for playing chords, both strumming and finger picking. Top job on emulating the UI of a real guitar. To be honest, I don't have a need for it because I play acoustic guitar into my iPad with an audio interface. But I really think you succeeded in making a very playable, great sounding guitar chord app.

  • Strumming and fingerpicking - Guitarism.
    Anything requiring runs, bends and slides - Geo Synth, iFretless series.

    I tend to use piano keys for chords until I've settled on what I want to play, and by then I've learned the part, so I don't often program up chord pads.

  • @Hmtx Thanks for the kind words - guitarism's interface is just my personal interpretation of the chord+strumming paradigm that I believe was originally pioneered by iShred and then iterated on by everyone else.

    @PaulB Can you clarify what you mean by using piano keys for chords - do you play piano parts on a regular piano-style keyboard on screen?

  • edited October 2013

    Both. If it's two-handed rhythmic comping, I'll do it on my full-size keyboard, or indeed if that's where the idea started; but if it's just adding a few synth pads, I'll use the keys on the touchscreen for convenience. I like to be able to try different inversions to get the voice leading I'm after, plus I often try odd chords I haven't bothered to work out the names of. Assigning each chord to a button or pad interferes with that process. Plus, with assigned chords, all the notes in the chord retrigger on a change, whereas I may want to sustain a couple of notes while changing another in a more contrapuntal style. Piano keys allow that. Having said that, I have several times added string parts using the GarageBand smart strings chord pads with preselected chords, just because I like the sound, but they're a pain in the arse to play, frankly.

    I'd be quite interested in a portrait-format interface that had piano keys down one side and a 6-string strum/pluck area on the other. Played a bit like an accordion: up to 5 notes played on the keys with the left hand would be triggered by touching or strumming the strings with the right. Any string not allocated a note would be silent. That would give me the chordal flexibility I like, with the strumming and plucking feel of Guitarism. One other feature could be the ability to lock a chosen note to a particular string, so that string would always sound that note when plucked and would not be reassigned by keyboard changes. This would facilitate simulation of open strings and drop tunings, etc.
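
    The assignment logic could be as simple as this Swift sketch; the names and the low-to-high dealing order are my own assumptions:

    ```swift
    // Held left-hand notes are dealt onto six virtual strings; locked
    // strings keep their note, and a string with no note stays silent.
    struct StrumEngine {
        var lockedNotes: [UInt8?] = Array(repeating: nil, count: 6)   // per-string locks
        private var assigned: [UInt8?] = Array(repeating: nil, count: 6)

        // Call whenever the set of held left-hand keys changes.
        mutating func updateHeldNotes(_ held: [UInt8]) {
            let free = (0..<6).filter { lockedNotes[$0] == nil }   // strings open to reassignment
            let notes = held.sorted().prefix(min(5, free.count))   // up to 5 notes, low to high
            for i in 0..<6 { assigned[i] = lockedNotes[i] }
            for (slot, note) in zip(free, notes) { assigned[slot] = note }
        }

        // The note to sound for a plucked string, or nil for silence.
        func pluck(string: Int) -> UInt8? { assigned[string] }
    }
    ```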

  • @PaulB Interesting thoughts wrt strumming chords constructed by piano keys. Have you used Thor's strumming interface? That's kind of how it works.

    Also really like the locked-note string thinking - that's something I've been wanting to do too.

    All good fodder for guitarism 'composition mode' :)

  • Great stuff @paulb

  • I have tried Thor's strum feature. It's usable, but not in the right place and too small to function in the way I have in mind.

  • edited October 2013

    Interesting topic. I agree with others on Guitarism. Very nice interface for a six-string emulation. My only wish is that it had an additional control for a palm mute. Maybe a setting that allowed the current chord to ring out until the mute button (right below the chord buttons) is pressed. There is something not intuitive to me about lifting to mute.

    Also, I like the way Cubasis has the configurable chord buttons on top of the keyboard. It allows you to play chords made up of notes that you can't even see at the keyboard's current scroll position. It would be interesting to expand on that concept, e.g., what if each of those buttons could not only be configured to play a specific chord, but maybe also have its own mini-arpeggiator? So one button maybe just plays a straight triad, the next button plays the same triad but arpeggiates it upward in 8th notes, etc. (a sketch of that idea follows this comment).

    I think the touchpad concept in Magellan is interesting.

    I really like Animoog's keys.

    TC-11 has some interesting possibilities, especially if it were able to be used as a controller for another app. Like @Blackhaws, I haven't necessarily been able to compose music with it so much as have fun making interesting noises.
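
    To make the chord-button arpeggiator idea above concrete, a rough Swift sketch; the type names and patterns are invented:

    ```swift
    enum ArpPattern { case none, up, down, upDown }

    // Each button owns a chord (which may use notes outside the visible
    // keyboard range) plus an optional arpeggiator pattern and rate.
    struct ChordButton {
        var chord: [UInt8]      // MIDI notes
        var pattern: ArpPattern
        var rate: Double        // steps per beat, e.g. 2.0 for 8th notes

        // Note to emit at a given step (nil = play the whole chord at once).
        func note(atStep step: Int) -> UInt8? {
            guard pattern != .none, !chord.isEmpty else { return nil }
            let sorted = chord.sorted()
            switch pattern {
            case .up:
                return sorted[step % sorted.count]
            case .down:
                return sorted[sorted.count - 1 - (step % sorted.count)]
            case .upDown:
                // e.g. C E G -> C E G E, repeating
                let cycle = sorted + sorted.dropFirst().dropLast().reversed()
                return cycle[step % cycle.count]
            case .none:
                return nil
            }
        }
    }
    ```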

  • I think we are missing a complete DAW solution; all the ones on offer have something key missing that another has, and vice versa.

    That's what I'm missing: a complete solution to pull everything together... maybe in another year or so, once the tech and devs align.

  • I want, NEED, a direct brain interface so I can get the music in my head out into the world.

  • Lots of old living room style organs had a very clever accompaniment technique that I'd like to see in iOS. On the left side of the keyboard were the bass/chord/accompaniment notes. Usually as a standard keyboard, sometimes accordion style. This would play either a separate bass/chord sound or in some organs a whole Casio style rhythm section.

    They defaulted to major so if you played F you'd get an F major chord/pattern. The clever bit was that if you played an F plus any note above it, you'd get F minor. F plus any two notes above it and you'd get F7. Plus three would get you Fm7.

    Some organs also included a 'minor' bar that was sorta like a space bar that ran the length of the accompaniment section. Pushing your palm on it would change the chord to minor so one finger major/minor and two finger 7/m7.

    Takes a minute to get used to, but very quickly you could play reasonably complex progressions with a few fingers and very little thought beyond the root note.

    I could see something like that in an app and/or via a MIDI interpreter, where you split your external synth and set a section to work that way; a sketch of the chord logic follows.
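
    For any dev tempted by this, the chord logic fits in a few lines. A Swift sketch covering both variants described above; the names and chord spellings are mine:

    ```swift
    enum ChordQuality { case major, minor, dominant7, minor7 }

    // Count-based organs: F = F major, F + 1 note above = Fm,
    // + 2 notes = F7, + 3 notes = Fm7.
    func countScheme(extraNotes: Int) -> ChordQuality {
        switch extraNotes {
        case 0: return .major
        case 1: return .minor
        case 2: return .dominant7
        default: return .minor7
        }
    }

    // Bar-based organs: one finger gives major/minor, two gives 7/m7,
    // with the palm "minor bar" selecting the minor variant.
    func barScheme(extraNotes: Int, minorBar: Bool) -> ChordQuality {
        if extraNotes == 0 { return minorBar ? .minor : .major }
        return minorBar ? .minor7 : .dominant7
    }

    // Spell the chord as MIDI notes up from the root
    // (assumes the root leaves headroom below 127).
    func chordNotes(root: UInt8, quality: ChordQuality) -> [UInt8] {
        let intervals: [UInt8]
        switch quality {
        case .major:     intervals = [0, 4, 7]
        case .minor:     intervals = [0, 3, 7]
        case .dominant7: intervals = [0, 4, 7, 10]
        case .minor7:    intervals = [0, 3, 7, 10]
        }
        return intervals.map { root + $0 }
    }
    ```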

  • Thanks a lot, @TedBPhx; thanks to your suggestion, the iPad 13 will probably be drilled into our skulls!!! Physically, not metaphorically.

  • If it's drilled into our skulls we won't need a 13" interface!

  • I could use a MIDI keyboard. Yes, a simple virtual MIDI keyboard with a hold button, to use with apps like Cube Synth, Addictive Synth, Wavemapper and others that aren't into the concept of hands-free parameter tweaking. This keyboard would not be attached to any synth engine and would add practically no load to the CPU. Importantly, it would be Audiobus compatible, so you could quickly switch around and hit output record buttons and the like. It would ideally be a clean, flat design, not skeuomorphic or reliant on HD images, and thus very small in file size. It could get really fancy and allow scale collapse like the Thor synth keyboard, in which case it would have every scale known to man, like those in ThumbJam, and be user-editable. It could even have an upper and lower board. Don't forget the hold button. You could name it MIDI C-Clamp or Brick in honor of the old hardware solutions. Thank you in advance.
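
    The scale-collapse part is simple enough to sketch in Swift; the scale table and names here are just illustrative:

    ```swift
    // Semitone offsets from the root for a few scales; a real app would
    // ship the full ThumbJam-sized list and let users edit it.
    let scales: [String: [UInt8]] = [
        "major":            [0, 2, 4, 5, 7, 9, 11],
        "natural minor":    [0, 2, 3, 5, 7, 8, 10],
        "minor pentatonic": [0, 3, 5, 7, 10],
    ]

    // Key index 0 is the root; indices past the scale length wrap up an
    // octave, so the on-screen keyboard shows only in-scale notes.
    func midiNote(keyIndex: Int, root: UInt8, scale: [UInt8]) -> UInt8 {
        let octave = keyIndex / scale.count
        let degree = keyIndex % scale.count
        return root + UInt8(octave * 12) + scale[degree]
    }

    // The hold button is then just a latch: while it's on, suppress note-offs.
    ```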

  • "Midi c-clamp" for the win. Though I use tape. :)

    This might be what you're after: http://secretbasedesign.com/apps/midicontrol

  • For some reason I have not thought to mention this before, but one of my side projects is messing around with motion capture and gesture control. I have played with a few commercial sensors, and built and coded a sensor stream processing system myself.

    It is an interesting journey. And barely started. But already some Human Factors effects are becoming apparent.

    For example, I am very doubtful that non-contact gesture control will ever be successfully applied to message-level MIDI translation, a la the Leap Motion Gecko project. Or Leap Motion itself, which I expect to fade away in a year or so. My reasoning here is that this kind of control requires high precision in the motion, with no extraneous movements not intended to be interpreted. And the human part that leads to fatigue is the absence of any physical feedback to the sense of touch, and of the resistance the body expects from something to counter the inertia of a sweeping movement. Accomplished mimes may fare better than the average person here.

    OTOH, I have a clear existence proof of non-contact motion sensing being very well implemented when the human is dancing, and not trying to be anal about MIDI messages. This is achieved by a project I was involved in several years ago, but which is still very actively evolving down in LA as part of the Vortex Dome venue: a system called LightDancer. The dancer dances on a mat with a number of IR sensors and colored LEDs embedded in its periphery. These are illuminated by an overhead IR lamp. The sensors detect shadow edges crossing them and map that to MIDI.

    But then it gets very interesting, because that MIDI stream is used for some pretty complex gesture recognition, not directly applied to instruments as most mocap systems attempt to do. I can't go into details, but essentially the motion is analyzed and used to parametrically control mini-sequences. One adapts to the experience as being one of immediacy within a few minutes, via a non-explicit (and happy accident) form of aural biofeedback. The system is venue-targeted, although a commercial variant is in development. It has been well received in a night club setting. No experience is required to dance on it; the result is always musically interesting.

    http://lightdancer.net/
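
    Not the LightDancer internals (those aren't public), but the front end any such system needs, smoothing a raw sensor stream before mapping it to MIDI, might look like this rough Swift sketch:

    ```swift
    // One-pole low-pass filter feeding a 7-bit MIDI CC value.
    struct SensorToCC {
        var smoothing = 0.9    // 0 = raw readings, closer to 1 = heavier filtering
        private var state = 0.0

        // Feed normalized sensor readings (0...1); get a CC value (0...127).
        mutating func process(_ reading: Double) -> UInt8 {
            state = smoothing * state + (1 - smoothing) * reading
            return UInt8(min(max(state, 0), 1) * 127)
        }
    }
    ```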

  • @smeeth - MidiStudio, a bit heavy on the memory but light on CPU, and I was going to check MIDI Designer for you, but it seems broken on iOS 6 / iPad 3 at the moment. No editing or setup buttons.

  • Very cool looking project @dwarman.

    Theremins of course come to mind as an example of a non-contact, gesture-based instrument, but to your point, very few actually mastered that instrument.

  • @syrupcore - I have seen several touch-based - and other - novelties come, and then go, because it took a performance maestro to use them effectively. Very nice things too. Like Jimmy Hotx's Translator, which he showed at NAMM 1991, played by Ian Anderson (IIRC). Estimated retail price tag of $10K. But he looked ahead as time went by, and the basic software part is available as a PC program today. It is also used as a LightDancer component, along with MAX and Zuma. Another one that gets very little following is Charles Lucy's guitar fretting (lucytune.com), which puts close groups of up to 5 frets where normal guitars have one. Charles is the microtuning guru, as far as I am concerned. But I could not for the life of me get anything remotely musical out of his guitar. I am far from maestro level.

    I'm a little worried the QuNeo may suffer the same fate, mainly because it is so expressive, with so many configurable options, that there are too many decisions to make when setting it up to closely match one's own needs.

  • Jimmy Hotz, not Hotx.

  • @dwarman I was on the Kinect launch team at Xbox before I started app dev. Totally agreed wrt non-contact gesture control requiring a much higher level of abstraction to work.

    Even with contact-based interfaces like a touchscreen, the lack of haptic feedback or surface texture of the glass creates similar problems when you're trying to control discrete-valued variables (less of an issue with continuous variables such as fretless pitch bend). As a result, edges are better than the center of the screen, and corners are better than edges. That physical anchoring is important for your body as it's interacting with the controller.

    Another way to say it: a discrete button in the center of the screen needs to be much larger than one in the corner of the screen, to provide the same level of anchoring (which affects precision and comfort) to the player.
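
    As a toy quantification of that anchoring point (the base size and scale factor are invented for illustration):

    ```swift
    import CoreGraphics

    // Scale a button's minimum size by how far it sits from the nearest
    // screen edge: 0 at an edge or corner, 1 at dead center.
    func minimumTargetSize(center: CGPoint, in bounds: CGRect) -> CGFloat {
        let base: CGFloat = 44   // a comfortable corner/edge target
        let dx = min(center.x - bounds.minX, bounds.maxX - center.x)
        let dy = min(center.y - bounds.minY, bounds.maxY - center.y)
        let edgeDistance = min(dx, dy) / (min(bounds.width, bounds.height) / 2)
        return base * (1 + edgeDistance)   // up to 2x larger mid-screen
    }
    ```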

  • edited October 2013

    @dwarman, I gotta say that I disagree about "non-contact gesture control" being impractical for MIDI. I mean, if the latency is small enough, you could just learn how your movements shape the sound. Those of us who perform with an acoustic instrument or vocals into a mic already do this. Distance from and position of the mic are part of the performance.

    I think a musician could do the same thing with hand movement.

  • Price and availability can also affect the uptake of a new instrument. The Ondes Martenot is completely tactile, but it was taken up less than the theremin.

  • @Hmtx but imagine playing an 'air acoustic guitar' where the frets and strings are just virtual areas of the air around you and you have to put your fingers in just the right spot to pick certain strings or fret certain frets. Your hands would have to move like a robot in order to play anything with intention, and you'd get tired really quickly. That is what @dwarman is getting at - a purely contact-free instrument is fatiguing.

  • @rhism, yes, but also imagine a guitar performance where I play my guitar but certain effects are morphed as I raise the neck of the guitar, and other effects as I bring the face of the guitar varying degrees toward horizontal.

    So I agree that gesture control can't be the instrument but it can improve it, no doubt.

  • edited October 2013

    @hmtx Hey! I do that with my electric cello!! I can not only control sound effects and add sounds, but also control lights and video projections with the movement of my cello.

    I did a talk at TEDxBerlin three weeks ago in which I explain this project and how it works. I hope I will have the video soon.

  • @Hmtx Using gesture to control effects knobs could definitely work. If the movement creates subtle changes in sound, that would probably be cool and potentially very useful. But if the results are very exaggerated, then the guitarist would need to stand very still when he/she doesn't want the effects to change, e.g. during a specific section of a song. Most of the prototypes I've seen that map gesture to effects tend to be over-exaggerated (perhaps because it makes for a better demo video), which IMO wouldn't be something a guitarist would be able to use for very long; but tone down the sensitivity and it becomes a useful way to add expression to a performance.
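
    In code terms, "tone down the sensitivity" might look like this rough Swift sketch: a deadzone plus an eased response curve, with both numbers purely illustrative:

    ```swift
    // Map a normalized gesture amount (0...1, e.g. guitar-neck tilt) to an
    // effect depth that ignores incidental stage movement.
    func effectAmount(gesture: Double) -> Double {
        let deadzone = 0.15                        // small motions do nothing
        guard gesture > deadzone else { return 0 }
        let scaled = (gesture - deadzone) / (1 - deadzone)
        return scaled * scaled                     // ease in: subtle first, strong at extremes
    }
    ```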
