Audiobus: Use your music apps together.
From the Mozaic Manual:
I would think someone could use these variables — the timing information you get from the AUv3 host app — to make a polyrhythmic arpeggiator or sequencer.
Have you looked into an app like Photon where you can load a MIDI file onto six different pads, route the MIDI output independently, and set the tempo division for the host for each one? The AUv3 parameters for the six pads can be MIDI controlled too.
Yes, you could make an arpeggiator like that, but what you couldn't do is make something that speeds up a channel. Mozaic can only affect MIDI that passes through it; for it to, say, double the playback speed, it would need to be able to see into the future. It could conceivably slow down playback for a limited number of notes, but variable storage is limited, so other apps are better for that. Atom and, as you noted, Photon can be used this way for pre-recorded MIDI.
Hey all, I'm fairly new to script writing of any kind and am getting most of the basics figured out, but I've gotten myself stuck and confused trying to implement the following idea...
I want to send a midi macro message to Loopy (to toggle mute/unmute of a few tracks), triggered by a pad on my external controller, which starts at the beginning of the coming next bar, and repeats 8 times (synced to the beginning of each subsequent bar), then stops/exits the loop...
As Host information doesn't get into a loop once the event is triggered, I can't figure out how to 1) make the macro/user event wait to be sent until the beginning of the next bar, and 2) make it repeat 8 times at the beginning of each subsequent bar.
Is there a way to ‘wait’ to start a loop after triggering it with an incoming MidiNote from my keyboard pad? I understand how to delay the sending of a midi message in milliseconds with the 4th byte but can’t figure this out...
Do I need to use an LFO for this?
I feel I’m just missing something pretty basic here but mind crunching for hours has brought me to my knees asking for help! Thanks
I'm also still learning but I'll take a swing at this problem to help me test my understanding.
There are two ways to manage event timing: counting the host's Metronome pulses, or running an independent Timer.
Here's some sample code to log the time to send the event after 8 x 4 = 32 Metronome pulses. The pulses, and the counting, start when you hit "Start" in the DAW's transport controls; until then the script is just loaded and waiting.
Here's the Timer-based version, which logs the time after 8 x 4 = 32 timer pulses (8 seconds).
As written here, the Mozaic clock will start running when you load the script, but you can start a Timer in response to some external event like a MIDI Note or CC input. I suspect you'd want to sync with the DAW's timing and BPM. I'm sure there are additional mysteries to be solved, but this will point you in the right direction, I hope.
I wrote this code without any testing so hopefully you'll see those "Log" messages as I expect based upon the DAW's BPM or the 8 seconds of Timer pulses.
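The pulse-counting idea above can be sketched outside Mozaic. Here's a minimal Python illustration (not Mozaic syntax; the handler and variable names are mine): a counter incremented once per pulse, acting when it reaches 8 x 4 = 32.

```python
# Sketch of the pulse-counting idea: with 4 pulses per beat in a 4/4
# bar, 8 beats * 4 pulses = 32 pulses. Counting pulses in a handler
# (like Mozaic's @OnMetroPulse or @OnTimer) and acting when the count
# reaches the target is the core of both approaches.

TARGET_PULSES = 8 * 4   # 32 pulses before the event fires

count = 0
fired_at = None

def on_pulse():
    """Stand-in for @OnMetroPulse / @OnTimer: called once per pulse."""
    global count, fired_at
    count += 1
    if count == TARGET_PULSES and fired_at is None:
        fired_at = count      # here you'd Log and send the event

for _ in range(40):           # the host (or timer) keeps pulsing
    on_pulse()

print(fired_at)   # 32
```

With a Timer firing every 250 ms, those 32 pulses take the 8 seconds mentioned above; with Metronome pulses, the wall-clock time depends on the DAW's BPM.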
I think all you need is the @OnNewBar event for what you want to do...
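The @OnNewBar approach boils down to "arm on pad press, fire at the next 8 bar starts." Here's a plain Python sketch of that logic (illustrative only, not Mozaic syntax; the class, handler, and `send_macro` names are mine, standing in for @OnMidiNoteOn, @OnNewBar, and the SendMIDICC calls):

```python
# Sketch of "arm on pad press, fire at the next 8 bar starts".

class BarRepeater:
    def __init__(self, repeats=8):
        self.repeats = repeats
        self.remaining = 0      # 0 means "idle"
        self.sent_log = []      # bars on which we "sent" the macro
        self.bar = 0

    def on_midi_note_on(self):
        # Pad pressed: arm the repeater. Nothing is sent yet;
        # sending waits for the next bar boundary.
        self.remaining = self.repeats

    def on_new_bar(self):
        # Called once at the start of every bar (like @OnNewBar).
        self.bar += 1
        if self.remaining > 0:
            self.send_macro()
            self.remaining -= 1

    def send_macro(self):
        # Stand-in for the MIDI CC messages that toggle Loopy tracks.
        self.sent_log.append(self.bar)

r = BarRepeater(repeats=8)
r.on_new_bar()            # bar 1: idle, nothing sent
r.on_midi_note_on()       # pad pressed mid-bar
for _ in range(10):       # ten more bar starts arrive
    r.on_new_bar()
print(r.sent_log)         # macro fired on bars 2 through 9 only
```

The key point is that the pad press never sends anything itself; it only sets a counter that the bar-start handler consumes.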
Thank you for providing a complete solution.
Thanks @wim and @McD! You’re awesome. I got it working with this help and understand something much better now!
I'd been trying to do something like embed an event inside another one, like:
Which as far as I can tell is not possible right?
Anyways I understand now how to accomplish this kind of thing simply with variables as you showed in that little super helpful bit of code @wim. Do you ever sleep I wonder?
One question though...what’s the reason for the sendchannel, sendcc, and sendvalue variables? Rather than just entering those values into the SendMidiCC bytes?
So this is what I ended up with to also send the macro 8 times...is this actually the best way to repeat the sending of the macro 8 times at the beginning of each new bar?
The problem I see with the options you offered @McD is that the macro would indeed wait to send until a full bar after the trigger note is heard, but that would only line up with the start of the next bar if I happened to trigger it exactly at the start of a bar, since it's set to either milliseconds or pulses instead of the next bar... I think... 😆
That's just a preference on my part. I like having anything I or someone else might want to adjust up in the beginning of the script and in variables so that if changes are needed a) they're easy to find, b) you only have to change once even if the value is used multiple times, and c) it's easier to see what the code is doing with names rather than numbers.
Looks good to me. I like that you made a separate call for a custom event rather than embedding the code all in the loop. That also makes it easier to read and maintain.
I'd say you've got a good grip on things.
As for embedding the built-in mozaic events inside other events, as you've seen, you can't do that. They're like independent watchers, waiting for events to happen, and you can't call them. By contrast, user created events don't watch for anything at all, and must be called.
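That distinction can be illustrated in plain Python (not Mozaic syntax; the registration mechanism and names here are mine): built-in events are callbacks the host invokes, while user events are just named blocks you call yourself.

```python
# Analogy for Mozaic's two kinds of "events". Built-in events like
# @OnNewBar are callbacks the host invokes -- you never call them
# yourself, and you can't nest one inside another. User events are
# plain named blocks that run only when you Call them.

handlers = {}

def on(name):
    """Register a built-in-style handler; only the 'host' calls it."""
    def register(fn):
        handlers[name] = fn
        return fn
    return register

log = []

@on("new_bar")
def handle_new_bar():
    log.append("new bar")
    do_macro()          # a user event is just called explicitly

def do_macro():
    """Like a user-defined event block: runs only when called."""
    log.append("macro")

# The host drives the built-in handler; it calls the user event:
handlers["new_bar"]()
print(log)   # ['new bar', 'macro']
```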
Yes. I don't even get partial credit for my answer, but I suspected someone might be exposed to the concepts, and we can slowly add more novice programmers and have them ask the good programmers here for help. It pulled @Wim into setting the record straight.
Please continue to ask questions... I know I will as I uncover some puzzle making new MIDI scripts. Making them is really fun... when they work, and puzzling when they don't. I love a good puzzle.
@Wim has really helped make it clear that Mozaic has these event listeners:
(there are many)
And the script author can wrap up some code with a label:
Then call that code with a "call" instruction:
Someone is thinking... I got that.
A running Mozaic script is really a collection of potential "listeners" waiting for MIDI events to trigger code execution.
A question for the experienced programmers out there.
What common programming language is Mozaic script most like in your experience?
Something with "Listeners" (Event Handlers) and "Functions" maybe?
Not common anymore, but Mozaic was designed to be similar to Pascal. Which is cool for me, because Pascal (and its cousin Modula-2) were the first languages other than BASIC that I got serious about learning. It has always seemed like an efficient language for smaller-scale stuff because it's easy to get a grip on.
Cool. I can see that for the easy to type syntax defining sections.
Smalltalk? Lisp? I remember they are essential for making graphical user interfaces, though that might be more of a systems facility than a language concept.
IMO, Mozaic is the most elegantly designed purpose-built language I've ever used.
This is open for anyone to comment and share their tips.
Do you have any advice regarding "data structure" in your use of Mozaic? I have considered putting data into an array as a sequence, like a string of 4- or 5-note chords, or using a delimiter value at the end of each run of elements. Knowing that the 1024-element limit is always there, I reconsidered and just used one variable per chord, but then I have to hard-code every use of it. Trade-offs always need to be made.
I can't offer any specific advice on that without more context about what you're trying to store.
If you need to store more data than 1024 numbers, then you need to pre-allocate enough arrays and then maintain two counters. One for the array to use, and one for the position in that array. Then you need an if/else/endif block to tuck the data into the right array.
(You mention strings and delimiters. There is no string variable storage in Mozaic, only numbers.)
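The multi-array scheme described above can be sketched in Python (illustrative only; Mozaic would use an if/else ladder instead of list indexing, and the names here are mine):

```python
# Sketch of the multi-array workaround: arrays hold at most 1024
# numbers, so to store more you pre-allocate several arrays and keep
# two counters -- which array, and the position within it.

ARRAY_SIZE = 1024
NUM_ARRAYS = 3

arrays = [[0] * ARRAY_SIZE for _ in range(NUM_ARRAYS)]
cur_array = 0    # which array we're currently writing into
cur_pos = 0      # position within that array

def store(value):
    """Append a value, rolling over to the next array when full."""
    global cur_array, cur_pos
    arrays[cur_array][cur_pos] = value
    cur_pos += 1
    if cur_pos == ARRAY_SIZE:
        cur_pos = 0
        cur_array += 1

def fetch(i):
    """Read back the i-th stored value."""
    return arrays[i // ARRAY_SIZE][i % ARRAY_SIZE]

for v in range(2000):          # more than one array's worth
    store(v)

print(fetch(1500))   # 1500 -- this value landed in the second array
```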
That's a pretty good answer. You're indicating that there isn't another, more creative way to store data.
Here's what I was considering. Let's say I have a variable describing chord types:
Major gets coded as 0, 4, 7 (half-steps above the root)
Minor is 0, 3, 7
Dominant 7th 0, 4, 7, 10
I can "string" there numbers into a single array and separate them with a delimiter
Chords = [0, 4, 7, 99, 0, 3, 7, 99, 0, 4, 7, 10, 99]
Then the code can pull out the 1st, 2nd or 3 chord by counting those delimiters. Using delimiters will allow for extra notes to be added to the "Chord" array.
Packing extra content into a data structure with these delimiters is a type of data encoding.
It can effectively allow me to encode arrays so they have more dimensions than a single index.
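The delimiter-scanning lookup can be sketched in Python using the exact chord data from the example above (illustrative only; the function name is mine):

```python
# Sketch of the delimiter-packed chord array: major, minor and
# dominant-7th intervals packed into one flat array, separated by a
# 99 delimiter (a value that can't be a real interval).

DELIM = 99
chords = [0, 4, 7, DELIM, 0, 3, 7, DELIM, 0, 4, 7, 10, DELIM]

def get_chord(n):
    """Return the n-th chord (0-based) by scanning for delimiters."""
    out, seen = [], 0
    for v in chords:
        if v == DELIM:
            if seen == n:
                return out
            seen += 1
            out = []
        elif seen == n:
            out.append(v)
    return out

print(get_chord(0))   # [0, 4, 7]      major
print(get_chord(2))   # [0, 4, 7, 10]  dominant 7th
```

The cost is a linear scan per lookup, which is the trade-off the "ref array" idea later in this thread removes.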
I'm working on a chord making script and haven't decided which approach to release.
Major = [0,4,7]
Minor = [0,3,7]
which makes me use "case statement"-like ladders of if-statement tests:
With an encoded chord map I can use that "Chord variable" in a loop, process the correct code, and avoid long lists of "if" tests. Trade-offs. But knowing what's going on would not be obvious to someone reading the code. If it runs... I'm happy. @_Ki has followed my intention and made much more elegant versions of a nasty StreamByter script I posted.
Well, it’s semantics only, but you’re actually not counting delimiters, you’re referencing positions in the array.
(BTW, you can store negative numbers in an array as well. I would normally use -1 rather than 99 in the example above.)
For interest's sake: I found an easy way to construct regular chords is to quantize every third note above the root to the chosen scale. This is what I used to construct chords of any degree ("notes" = degree in the snippet below, from the Chordulator script).
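One way to read this stacked-thirds idea is: instead of storing an interval table per chord type, take every other note of the chosen scale above the root degree, letting the scale itself do the "quantizing." Here's a Python sketch under that interpretation (the function and scale names are mine, not from the Chordulator script):

```python
# Diatonic thirds: chord on a scale degree is built by taking every
# other scale step above it. The scale quantizes the result, so the
# same code yields major, minor, or diminished triads as appropriate.

MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11]   # semitones from the tonic

def diatonic_chord(degree, notes=3, scale=MAJOR_SCALE):
    """Chord built in thirds on the given 0-based scale degree."""
    chord = []
    for k in range(notes):
        step = degree + 2 * k               # every other scale step
        octave, idx = divmod(step, len(scale))
        chord.append(scale[idx] + 12 * octave)
    return chord

print(diatonic_chord(0))       # [0, 4, 7]        I   (major triad)
print(diatonic_chord(1))       # [2, 5, 9]        ii  (minor triad)
print(diatonic_chord(4, 4))    # [7, 11, 14, 17]  V7
```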
@McD And I even finished an even more complicated version of the One Finger Orchestra, which I didn't publish, as playing with it did not inspire me.
And yes, you can do that kind of packed storage in Mozaic too; it works the same as in the StreamByter examples we exchanged back then. As you mentioned, you need a delimiter for the chords.
From that chord array I compiled a ref array, indexing into the chord definitions at the start of each chord. Using the ref array, it is easy to pick out the chords from the packed storage.
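The ref-array idea can be sketched in Python (illustrative names; using -1 as the delimiter, as suggested below): scan the packed array once to record where each chord starts, and every later lookup becomes a direct index instead of a delimiter count.

```python
# Build a "ref array" of chord start positions over a packed array,
# then read any chord directly.

DELIM = -1   # a negative value can't be a real interval
packed = [0, 4, 7, DELIM, 0, 3, 7, DELIM, 0, 4, 7, 10, DELIM]

# One-time scan: record the start index of each chord.
refs = [0]
for i, v in enumerate(packed):
    if v == DELIM and i + 1 < len(packed):
        refs.append(i + 1)

def chord(n):
    """Read chord n starting at refs[n], up to the next delimiter."""
    out = []
    i = refs[n]
    while packed[i] != DELIM:
        out.append(packed[i])
        i += 1
    return out

print(refs)        # [0, 4, 8]
print(chord(2))    # [0, 4, 7, 10]
```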
@wim IIRC I used 0xFF as the delimiter in my StreamByter code. Using -1 also works and is easy to test (< 0).
In the end I had defined 37 different chord types to pick from, all stored in a single array using the packed form.
Like in the OneFingerOrchestra script of @McD, one could pick a chord per base note; this script offered 4 of these base settings. All chords could be inverted/extended in 6 ways, strummed in sync with variable timing, and arped in 8 patterns — all controlled by 4 CCs (chord set, inversion, strum speed, arp pattern), since at that time StreamByter didn't have UI support.
It worked very well, but it sounded mechanical and not as musical as I expected. Automation produced generative pieces, but it didn't bring more fun than the first experiments with @McD's script.
Your idea of using the Mozaic quantizer is nice, thanks for the tip.
I hope this 'boffin talk' doesn't scare off the other readers of this thread.
Ahh. I misunderstood. I thought when @McD referred to “delimiters” he was referring to the commas between the numbers in the array. Got it. Sorry for the confusion.
The other way around, I find it highly inspirational, and I'm still only at "reading up on things" phase.
What I wouldn't give for multi-dimensional arrays.
Sure you can do the same with a packed array with the first elements being offsets but... why?
Multi dimensional arrays would be great to have, but seem like they’d be tough to implement with the type of memory and realtime processing constraints an AU plugin like this has to deal with.
One useful encoding scheme that just hit me would be to use:
indexes 0-9 for the notes of chord voicing #1
indexes 10-19 for chord #2
up to a max of indexes 1010-1019, for just over 100 possible 10-note chord voicings.
I have this other array that maps a chord type to each note in the chromatic scale from 0-11. That might save having to code a huge list of "if tests" for chord type.
I was hoping someone would help provide some insight and this did help.
@McD You are right - I had to use the packed format in StreamByter due to the 256-element restriction on arrays. My 37 chords were spread over two of them, so the stored ref added an offset of 256 to indicate chords in the second array.
For Mozaic, with its longer arrays, you can use a fixed offset, which has to be larger than the size of the longest chord plus one - since you still need to store a chord-end delimiter if you want to support variable-length chords.
Initialization is easy:
If you are not sure about the size of the longest chord, you could use the safer initialization, which allows changing the _offset with a single line:
In an earlier version of the mentioned StreamByter Chorder script, I used a fixed offset of 8, which was enough for chords with up to 7 notes. But the longest chord I wanted to store had 9 notes, so I had to change to the packed form due to the max-array-size limitation.
What I'm playing with is creating 10-note voicings and giving the user a knob to control how many of the 10 notes are used, from 2-10. Every chord type could have up to 10 notes.
Then I added a delay knob to strum and arpeggiate the (up to) 10 notes, and then I added... you get the idea.
I'm creating a bunch of knobs for various parameters you can change, including completely different sets of chord maps for classical, jazz, power chords, quartal structures, etc. I also have a "root" change which makes it apply the harmony around a selected root (tone center).
I want to add playing the chords and arped notes from the DAW's BPM... either as a variation of the script or another row of knobs, I think. Maybe the arps could play over the chords from the "one finger" version. I want it to have different features from @wim's excellent Chordulator script. But it's still my journey to understanding how Mozaic works.
@McD Sounds cool 👍🏻 and is picking up the ideas we discussed with the StreamByter script.
I also have an initial Mozaic version of a SmartChorder script based on the old ideas, but I haven't worked on it for the last half year. Please finish yours, so I don't have to finish mine.
As I am currently writing an In-Order Arp script (requested in another thread), I came up with the following idea for synced arps:
When using a metro pulse of ppqn, there are ppqn * HostBeatsPerMeasure pulses per bar. A 1/8 note will play for pulsePerBar / 8, which leads to divisions = Round (pulsePerBar / arpTiming).
A gate length of lenPercent % needs noteEndPulse = RoundDown ( divisions * lenPercent / 100 ) pulses.
That's all the precomputation needed to run the following @OnMetroPulse event to generate the synced, arped notes:
More tricks: Triplets need divisions = Round ( pulsePerBar / arpTiming * 2 / 3 ) and fit nicely into a bar. To support dotted arps, I had to modify the base idea, as their 3/2 factor results in a 3-bar repetition cycle, breaking the _div calculation. The modified version
_div = (CurrentMetroPulse + ( (HostBar-arpStartBar)%3) * pulsePerBar)
also works for triplets and regular notes. arpStartBar = HostBar is stored when the first note for the arp chord arrives.
And a last hint: the above method only arps while the Host is playing, but I extended the idea into a user event (with pPulse and pBar parameters) that can also be called from @OnTimer when the host is not running. The @OnTimer generates pPulse and pBar by counting pulses; @OnMetroPulse supplies CurrentMetroPulse and HostBar.
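The precomputation above is easy to sanity-check in Python. Here's the arithmetic with an illustrative resolution of 24 pulses per quarter note in 4/4 (the variable names mirror the formulas above; `//` plays the role of RoundDown):

```python
# Checking the synced-arp arithmetic.
ppqn = 24                    # illustrative metronome resolution
beats_per_measure = 4        # HostBeatsPerMeasure in 4/4
pulse_per_bar = ppqn * beats_per_measure          # 96 pulses per bar

arp_timing = 8               # 1/8 notes
divisions = round(pulse_per_bar / arp_timing)     # pulses per arp step

len_percent = 75             # gate length in percent
note_end_pulse = (divisions * len_percent) // 100 # RoundDown(...)

# Triplets: three notes in the space of two.
triplet_div = round(pulse_per_bar / arp_timing * 2 / 3)

print(divisions)        # 12 pulses per 1/8 note
print(note_end_pulse)   # 9  pulses of gate time
print(triplet_div)      # 8  pulses per 1/8-note triplet
```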
If no unexpected distractions or problems arise, the In-Order Arp script will be published after the weekend. The core functionality is already working and the GUI layout is done.
That will be a nice functionality to add to the Mozaic tool chest. And thank you for sharing the details on BPM note releases. I was having fun with various PPQN settings from 1-8 and then running 2 with polyrhythms like 2:3 or 3:4. You can generate some complex textures between 2 instruments.
Here's a sample test drive of my Mozaic script in its current form. I can take a single MIDI note and convert it to:
1: large chordal voicing - suitable for orchestral ensembles
2: apply a configurable delay and arpeggiate the chord voices - suitable for pianos, harps
I started up 2 copies and fed one to some strings and a synth and the 2nd to a piano and harp:
Wow, I loved that! Looking forward to this one!