"Vibe" is absolutely a thing. Not in a general "vibe out bro" sense but in a personal, "I vibrate differently when I play my ______". As is the creative power of constraints—no MIDI means no wasted brain space on editing, automating, etc. I don't think it's funny at all that you and your Rogue get on real friendly like.
Well said. A MIDI file is just sheet music in a different format. Recording MIDI is not dissimilar from using a pencil to write notation on a score — you're free to edit. Its benefits over traditional sheet music are many, but the main ones for most people (speculation alert) are fairly incremental: it's a hell of a lot easier to rearrange stuff, and the resolution/representation of any note in time can be captured in much more detail.
Dan Baker, the Garageband video guy, just did a discussion video on playing by yourself with software versus playing with other musicians which touches a bit on this thread. If you've never watched any of his videos, you're missing some serious musical insights.
As @Linearlineman discovered, buying a controller that feels like a real piano helps combat this effect.
I found out last night that the weighted keys on my controller are awful for pushing out complex chords. The iPad's onscreen keyboard is faster and feels more natural for these "one finger" creations.
It's like the feeling a great acoustic guitar creates... nothing is like it.
Still, there's something in saving the notes you played, being able to edit them, and using them like cuttings from a magazine to make collages.
It's all empowering, huh?
That’s a deep question for any musician.
Does a composer whose music is played by an orchestra make the music? Or do the performers?
I think it mostly has to do with ego, how we think music should be, personal (and collective) creative concepts, human vs. computer, and our place as musicians in the process.
I agree with @TheDubbyLabby about past and future fighting for control of the present. We should have open minds; music has to be creative, and it can't be played forever by dinosaurs. There is no truth, except perhaps for the musical creator and how he sees his own creation.
MIDI has nothing to do with randomization. It's a tool used by composers too, even classical ones. You can write in MIDI, you can perform in MIDI. So it can be live-performed instrumental music, played by one or several musicians. What we can lose is the lifelike acoustics of real instruments, the human touch. This is why modern sampling methods with multiple articulations and round-robin features, or MPE, are so interesting. But when MIDI is performed by a human musician, even a purist should agree the magic sauce is there. No robot involved.
Now, there are different musical forms. Collage and sampling are an art form, deeply artistic and technical. An instrumental purist should try some beat making too; good luck making something that sounds nice on the first try. This has to do with concepts and ego. What can a sample add to the human perspective? Well, if you use an Indian sample or strings from Curtis Mayfield, it will add the sound and soul of that musician or place, of history, to your music. This is not from you; this is from the world. This is the concept, the urban-music concept: capture the sound, spirit, and moods from all around you. This is creative, this is genius. Manipulating and chopping samples, and playing them with MIDI controllers, is music; this is human performing. Electronic music has its gurus of this art. I'm sure a lot of musicians have never thought about that.
And now there is the art of sound design at the production stage: shaping sound, creating grooves with effects, deep ambient or dark moods. This involves very technical skills, knowledge, taste, ears... This is like painting with brushes: if you don't know where you're going, it will sound wrong, while the best producers sound like no one else, with a very personal color. OK, it can be less of a real-time performance, but are classical composers real-time? Same process, from brain to sound!!
What about generative music? Well, I didn't feel any attraction to it for a while, and really discovered it with iOS and Rozeta. I now use it very often in my creative process: it adds a lot of movement and life to the music. I can do a track using mostly those instruments, or only one or two elements, or none at all... It's like sampling: this sound is from elsewhere... and it can push your creativity even while you're performing your instrument.

When randomization is all over the place it can be disturbing, but you can program things. Like anything in music, it's all about cycles, repetition, and modification. You can control some things, like the cycle length, or use follow actions with a dominant section played more often with a lower random factor. This random "mod" knob is also what makes things interesting from a player's perspective: the instrument part evolves and never plays exactly the same thing, even if there is a reproducible scheme. Isn't that like a real musician, say a bassist or drummer? Algorithms can add dynamic accents, slides, note randomization... this is huge and really reminds me of human playing, I mean band-style human playing. I really feel it when playing with those robot instruments, from the perspective of an acoustic instrumentalist and improviser: it surprises me and pushes me to improvise in response.

I'm also a purist, and I will always use all of this to feed musical elements to my main artistic focus, which is sax playing. But those new instruments (they are not only tools) really add something to MIDI playing and bring life to music. Funny that a computer can give life, isn't it? But I also think it's great to choose, as a human, how to use it, when to use it, and in what proportions, the same way a composer asks a performer to play one way or another. Don't let the robot create everything; it can be fun, but it's not as creative, and we can lose at this game. In the end, this is ego and control.

Music is about losing control too; we should have open minds. But in the process, there is the creator's intention. If at the end of a creation you can say to yourself "this is me!!!" then it's all good.
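The cycle-plus-randomness idea described above can be sketched in a few lines of code. This is only an illustrative toy in Python, not how Rozeta actually works; the scale and the `mod` knob name are my own assumptions.

```python
import random

# An arbitrary scale for the sketch (C-minor-ish MIDI note numbers).
SCALE = [60, 62, 63, 65, 67, 70]

def play_cycle(pattern, mod=0.2, rng=random):
    """Return one pass through a repeating cycle of MIDI notes.

    `mod` is the randomness knob: 0.0 repeats the pattern exactly,
    1.0 replaces every step with a random note from the scale.
    """
    return [rng.choice(SCALE) if rng.random() < mod else step
            for step in pattern]
```

With `mod=0` the cycle is fully reproducible; raising `mod` makes the part evolve each pass while the underlying scheme stays recognizable, which is the "never plays exactly the same thing" quality described above.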
Yeah basically you just use the tools that you require to produce the music that you want. Simple. Straightforward. Nothing to worry about. Just get the job done.
I think your approach to music opens a lot of doors for me. I can see why a lot of great musicians became very spiritual in the endgame, to deal with the internal conflict of ego and control. I find my mind going into an observer's role too often, actually wondering what some particular audience might think of something I'm creating in real time. Losing the self to find something closer to a pure self is the end and purpose of art.
Thank you for documenting your views with such precision. Robot is a loaded term. It is other, and yet it's probably just me in the mirror. Who is that? It's not me, but how another would see me. Damn, I've really let "me" fall apart. I look like my father did, and that did not end well. I am not ready for that. There's too much I'd like to do yet.
@Janosax @McDtracy
This is what I called self-satisfaction, which could even go further into meditation or self-realization. Sometimes I need to play to channel my true Self into this Matrix we call Reality... but deep inside I feel the Truth, where the infinite realm opens its doors and there is no "me and the world" anymore, since Life is One by definition, and connecting to IT is the most alive experience we can achieve.
There are other ways to enter that zone, but since the Universe is vibration, I feel this is the most wonderful one (in Harmony terms), laughter being the fastest (a fact studied by the Laughing University of India), so there is no best or worst in the realm of Joy...
I hope you get where I'm pointing...
Can you feel it?
@McDtracy this reminds me of a French music theory book I read when I was starting out in music: music is about pitch, duration, dynamics, and tone.
I think it's a musician's freedom to emphasize any of those elements, if he wishes to and it makes sense to him. Percussion music doesn't really have pitch, but isn't it music? I know you like Pharoah Sanders; the playing of all those New Thing musicians/instrumentalists was often more about sound solos than note solos. They didn't care about opinions, and it was certainly a very rich art form, an expressionist one. This was already sound-texture exploration back in the '60s, like we can do today in more contemporary electronic music. And breaking musical rules was about breaking societal walls too, so it was a powerful art as well.
Yeah, I get what you mean. Enjoying the process is more important to me than the final product. My creative process relies a lot on sound and texture shaping, because that is a real-time relationship between the sound and me. It's relaxing and meditative for sure, and I can put a lot of myself into my creation that way, much more than in a piano roll. I often create moods or grooves which give me some inspiration for instrument playing and improvisation. This is how I validate a track: whether or not I can play something good over those sounds with my sax. I think it's like enjoying the whole path of life, not staying focused only on a career, etc. Life philosophy and art making certainly form a nice combo; or perhaps, in the end, they are one and the same thing.
@McDtracy this reminds me of Ornette Coleman and his Harmolodics concept, as on the Free Jazz album. Every single musician was playing a solo at the same time. While giving all of their individual musical expression, they were listening to each other, giving that sense of harmony and beauty in the chaos through individual melodies. How, in a society, can we do that? Each one can find their own way and express themselves fully, in collective harmony, thanks to open and benevolent individual minds. This made a lot of sense back in the '60s; Archie Shepp worked at a university for years, teaching revolutionary concepts in the African-American music of the '60s. Did you know that Ornette Coleman had his teeth broken in an assault by men who couldn't stand that he played a plastic saxophone? And you need your teeth to play sax... How stupid individuals can be; it's even worse when combined with collectively closed rules and formatted thinking.
Let me try again...
Just watch the first part of the dialogue, until they start talking about honor and such (which is interesting, but not related at all to what I'm trying to point out...)
@McDtracy is pointing in the same direction... I'm just trying to make you feel what it's all about...
Robots as tools only become instruments not when humans use them per se, but when something more subtle gets involved in the equation. The day something other than a human reaches that point, we can start talking about true Artificial Intelligence.
Amen and amen.
Like Michael Jackson said:
@Janosax That's it!
@LinearLineman - You want robots? Be careful what you ask for.
@McDtracy, which of the robots was you... I mean who was the robots' god?
I must say this thread is a lot more philosophical than I thought it would be. Which is fine by me, always. On the other hand... > @lukesleepwalker said:
Well, that is absolutely correct in the present moment, but it does not project how we want our robots to act in the future.
I am so ill educated that most of the things I want are probably already available and I just don't know how to utilize them. For example:
The ranges of actual instruments and their MIDI interpretations do not coincide. In other words, a trombone patch will play a C6 or whatever, even though it is out of the instrument's normal range. I would like my robot, when it sees that a note is too high, to drop it a number of octaves (or even partials), defined by my input. I would like to be able to touch the screen and do that. Is it possible now? I know there are transposition tools, but if I lower the top notes, the bottom notes get too low. Anyway, I would like this option (please tell me how to do it easily!). But this is just an example of how the robot could be better for my needs. I do work with the tools as they are, and I think I am more (un)conscious in my improvisations to compensate somehow. I always seem to work around the limitation, but I would prefer that this were not a limitation. This possibility might already exist, but other useful stuff probably does not, especially for advanced users.
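For what it's worth, the behavior being asked for is simple to express as a MIDI filter rule: fold only the out-of-range notes by whole octaves, instead of a flat transpose that drags the low notes down too. Here is a hedged Python sketch of the idea; the trombone range numbers are my own rough assumption (about E2 to F5), not taken from any particular app.

```python
TROMBONE_LO, TROMBONE_HI = 40, 77  # assumed playable MIDI range

def fold_note(note, lo=TROMBONE_LO, hi=TROMBONE_HI):
    """Shift a MIDI note by octaves until it lies inside [lo, hi].

    High notes are dropped an octave at a time; low notes are raised.
    Notes already in range pass through untouched, so the bottom of
    the part never gets pushed too low.
    """
    while note > hi:
        note -= 12
    while note < lo:
        note += 12
    return note
```

So a C6 (MIDI 84) would land at 72, while a middle C stays where it is. A MIDI-processing app that accepts user scripts could implement the same logic with a per-instrument range table.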
So, what I had partly in mind when I posted this thread was....
What developments in MIDI, and in the robots that plug into it, would folks like to see in the near future to enhance the results or make the workflow more efficient?
And secondly, what developments in the last year or two have made the robotics better?
You might want to check out the StreamByter app thread. You could create a system that takes a stream of notes and creates different parts for the different instruments in your robo band: turning them into chords, percussion, bass lines, and melodies for specific instruments, and adding strumming and random variations. It could become a MIDI LEGO band construction set. The developer is very helpful, so you might want to go to the app's forum and let them know what you're hoping to accomplish.
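To make that "note stream in, band parts out" idea concrete, here is a tiny Python toy of the fan-out step. StreamByter itself uses its own MIDI-rule language, and the part names and intervals below are purely my invention:

```python
import random

def fan_out(note, vary=0.0, rng=random):
    """Derive simple band parts from one incoming MIDI melody note."""
    parts = {
        "melody": [note],
        "chord": [note, note + 4, note + 7],  # naive major triad
        "bass": [note - 24],                  # root two octaves down
    }
    # Optional random variation, in the spirit of the post: sometimes
    # move the bass to another chord tone or drop it an octave.
    if rng.random() < vary:
        parts["bass"][0] += rng.choice((-12, 7))
    return parts
```

Feeding middle C (MIDI 60) through with `vary=0.0` yields a C-major triad plus a bass root at 36; each derived part could then go to its own instrument channel.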
Thanks @infocheck. I will look into it. Expect an angry pm from @McDtracy as I will probably bombard him with questions about it... which he loves.
I don't PM, I sulk.
Many sample instruments (like the iSymphonic ones, I think) limit the playable notes to match the capabilities of the original instruments. I know SampleTank's dreaded Miroslav does.
Robot is a loaded term: it generally means fixed-function hardware designed to replace humans in work roles. What we're doing is just making more elegant instruments to extend the abilities of humans - Computer Aided Music Production (CAMP) {I made that up}.
The "Me" in my SoundCloud project is the lead line, not played with perfect precision, over the funky electric piano part.
Anything you can carefully describe could potentially be coded but who would be foolish enough to write code for one person without some kind of contract for payment? Sorry.
Programmers write their apps hoping to get paid, but as Michael indicated in his interview with Matt, it takes a lot of luck to make more than minimum wage. Sad, but the life of the artist has always required patrons or independent wealth to sustain it.
Some lucky souls sell their work.
I'm a total junkie for those creative clusters. Luckily they're not the first, and hopefully not the last. When I stop seeking them out, that will be the time to wrap up this wretched existence.
That's as good as it gets with any recording tool(s) - and it's cool.
You can still revitalize the moment or alter it, regardless of whether you mod MIDI data or pass a sheet to fellow musicians.
@LinearLineman
Here's a small loop of ROBOT (human assisted) Jaaazzz
Since I know you appreciate both I'm posting it here.
Interesting laid-back robots, @audiblevideo. @McD should have a listen; he'll give it an exact genre name.
Nice jazz robot vibe there @audiblevideo.
Jazz is not dead.
It just smells funny.
FZ