Learn to Program the Mozaic Workshop to create MIDI FX and Controllers *you could learn something*


Comments

  • hes
    edited April 2020

    @McD said:
    . . . Ideally you should only have to edit variables in the @OnLoad section at various stages and I'll always make an AUM project available since it saves everyone a lot of extra "text processing" that comes with coding.

    I've been fooling around with extending your script to use three different knobs to:
    (knob 1) switch between any of 16 different scales,
    (knob 2) switch root to any of 12 base notes, and
    (knob 3) switch octaves,
    and (4) show notes being played on flashing pads,
    . . . hopefully all working to change settings in real time as notes are being played.

    I can get it tweaked a bit and post the code if you're interested. Not sure, because it seems like you want the pleasure of working through this stuff yourself. In any case, those additions seem to be ones that make sense. Another one might be a knob to control how many notes of the melody get cycled through (length variable?).

    Ideally, it would be nice to be able to write and/or tweak the melody line itself in the GUI, but Mozaic doesn't have a great interface for that. Still, there are things that could be done, including modifying the melody line itself while it's playing.
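    (To picture the length-knob idea, here's a rough sketch in plain Python, since that's easier to test than Mozaic; `knob_to_length` and `cycle` are made-up names, not Mozaic commands:)

```python
# Hypothetical sketch: map a 0-127 knob value to a cycle length of 1..16,
# then cycle through only that many notes of the stored melody.

def knob_to_length(knob_value, max_length=16):
    """Scale a MIDI-style 0-127 knob value to a length of 1..max_length."""
    return 1 + (knob_value * max_length) // 128

def cycle(sequence, length, steps):
    """Return the first `steps` notes produced when cycling `length` notes."""
    return [sequence[i % length] for i in range(steps)]

melody = [1, 5, 8, 7, 1, 5, 8, 7, 5, 4, 3, 2, 3, 5, 7, 3]
# knob fully down -> length 1; fully up -> the whole 16-note melody
print(knob_to_length(0), knob_to_length(127))  # 1 16
print(cycle(melody, 4, 8))                     # [1, 5, 8, 7, 1, 5, 8, 7]
```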

  • McD
    edited April 2020

    @hes said:
    I've been fooling around with extending your script to use three different knobs to:
    (knob 1) switch between any of 16 different scales,
    (knob 2) switch root to any of 12 base notes, and
    (knob 3) switch octaves,
    and (4) show notes being played on flashing pads,
    . . . hopefully all working to change settings in real time as notes are being played.

    I can get it tweaked a bit and post the code if you're interested.

    @hes - Your work on the accordion buttons request was epic!

    Post your solution here. I can use it to comment on the features and will probably base a demo AUM project on it if it's easy to use. Then I can show how adding your own sequence notes is really a form of piano roll, creating music on a par with the classic notation systems.

    Tied notes can happen when the same note is used in the sequence and the space between taps is less than the delay time.

    Spaces/rests are just a function of letting all notes reach their "delay" termination. But we can sneak "rests" into the pattern by choosing one of the extreme edges of the MIDI note range, between 0 and 127.

    Keyboards have 88 notes, but some synths will accept these corner-case inputs and do something with them, often with surprising audio results. I used "1" for rests, and some synths played their lowest note, like A0.
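    (The rest trick, sketched in plain Python rather than Mozaic; the sentinel value and the `emit_or_rest` name are just for illustration:)

```python
# Reserve an extreme MIDI note number (here 1) as a "rest" sentinel and
# skip sending a Note On for it.

REST = 1  # sentinel near the bottom of the 0-127 MIDI note range

def emit_or_rest(note):
    """Return the note to send, or None for a rest step."""
    return None if note == REST else note

pattern = [60, REST, 64, REST, 67]
sent = [n for n in (emit_or_rest(n) for n in pattern) if n is not None]
print(sent)  # [60, 64, 67]
```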

    I don't need to own this development effort... it can be a shared project. I'll just fork your code or, as you say, work through my own solutions. Too many Mozaic scripts seem like a problem, but so are too many comments: it floods our finite processors with input. But this thread can be filtered too.

  • Great teamwork on this thread, guys. You all are awesome. Looking forward to the development of this helpful tool.

  • McD
    edited April 2020

    @hypnopad said:
    Great teamwork on this thread, guys. You all are awesome. Looking forward to the development of this helpful tool.

    Wait until you see what @hes does with this idea... he's good and fast at following a user's input.
    Just keep asking for the features you will use and what you won't put up with to actually use it.

    I'll work on bolting on a Pad-based Sequence editor. I have the basics of it working in another script. This takes the idea away from asking users to open the code editor, but I'll keep pointing out that the data is exposed in some essential variables, and those variables can be manipulated/tweaked with knobs, and events too.

    Some of the classic MIDI mangling follows audio ideas: reverse, speed up/down, invert, pitch shift. Then we just keep adding more knobs. My scripts to this point have expanded to 22 knobs and multiple on-screen UIs using the shift key. But the code becomes massive and fragile as more changes are added, and it becomes work. In some ways I prefer to think of a handful of Objects. I might make a good Object to come before or after this one too, like the great MIDI Echo FX app which generates multiples of notes; an app to throw out notes with probabilities and mangling is fun too. Once you get a static pattern going (like the Kraftwerk hypnotic rhythms) it's fun to force subtle changes on it.

    This is where my recent Kat Pad experiments took me when I added keyboard solos back into the mix.

  • @McD are you going to feed any of the technical improvements back into glass?

  • @crifytosp said:
    @McD are you going to feed any of the technical improvements back into glass?

    At its core, Glass uses all these ideas... it's a basic sequencer wrapped in 100 updates. As you know, I wanted to enable sequence editing, but I started to have stability issues that wore me out, so I set it aside until my energy is up to the task of debugging it for the next big update.

    This is my attempt to allow complexity without embedding it in the code, which is why I want simple objects that can be plumbed together. Running 6 Pad Objects beats expanding to 6 parallel channels in a single app. But once we add some knobs I will need to make the variables GLOBALS that change in all instances with one knob tweak. That also might allow for one app with all the knobs and a bunch of minions that never need to be opened for tweaking.

    I used Glass in this last project, by the way, or maybe it was "One Finger Orchestra"... they both do that Kraftwerk-style meditative pattern generation when you hit GO on the DAW.

    I think I'd like to drive the new samples available in NS2 with these apps and start my NS2 journey.
    A good NS2 demo will get some extra attention on Mozaic as a tool for quick complexity. There's also an
    NS2 PatchStorage area while there's no AUM area. So... still looking for good venues to set up the soapbox and start preaching.

  • @hes said:
    I've been fooling around with extending your script to use three different knobs to:
    (knob 1) switch between any of 16 different scales,
    (knob 2) switch root to any of 12 base notes, and
    (knob 3) switch octaves,
    and (4) show notes being played on flashing pads,
    . . . hopefully all working to change settings in real time as notes are being played.

    I can get it tweaked a bit and post the code if you're interested.

    All these ideas would be so helpful. Having a device like this that is easily tweaked on the fly would be a huge step in creating my music workflow on iOS, which was previously limited to my laptop.

  • McD
    edited April 2020

    Here's a new AUM project with the modified scripts (included) ready to run 6 channels.

    https://www.dropbox.com/sh/pflln5xhsytct2g/AACrBsRTDfDJmTgR0GgGg0nfa?dl=0

    Here's the script with the new Root variable. I also copy Root into GLOBAL01 and check it on every "tapped" input. So change it in any script and the whole project will change keys. If that's not what you want, just comment out the line "Root = GLOBAL01".

    I also added an Array that holds the default Notes emitted by the Kat_Pad

    Kat_Pad_Inputs = [63,67,51,38,36,44]

    As each new instance is started, from 1 to 6, the Instance variable is used to set
    the correct Base_Note corresponding to the next pad in the sequence, in this order:
    Upper_Right = 63
    Upper_Left = 67
    Lower_Right = 51
    Lower_Left = 38
    Bass_Drum = 36
    Hi-Hat = 44

    You can fill this array with the specific notes your rig produces too.

    @Description
    
    PO1_v0.1
    
    PAD Object 1
    
    @End
    
    @OnLoad
    
     Instance = 1
     Root = 0
     Sequence = [1,5,8,7,1,5,8,7,5,4,3,2,3,5,7,3]
     Scale = [1,3,4,6,8,10,11,13,15,16,18,20,22,23,25]
     Length = 16
     Octave = 5
     Delay = 100
    
     // Top Right, Top Left, Lower R, Lower L
     // BD and HH pedals are last
     Kat_Pad_Inputs = [63,67,51,38,36,44]
    
     Log {Instance: }, Instance
     Base_Note = Kat_Pad_Inputs[Instance - 1]
     Log{Base_Note: }, Base_Note
     GLOBAL01 = Root
     Log{Root: }, Root
     Loop = 0
     ShowLayout 2
    
    @End
    
    
    @OnMidiInput
    
      Root = GLOBAL01
    
      if MIDICommand = 144 and MIDINote = Base_Note
        if MIDIVelocity = 0
          exit
        endif  
        MIDItype = MIDICommand
        // pad_note is computed here but not used yet
        pad_note = MIDINote % 12 + 1
        if Loop = Length
          Loop = 0
        endif
        Sequence_Note = Sequence[Loop]
        Note = Scale[Sequence_Note - 1] + (Octave * 12) - 1 + Root
        Log {Note: }, Note
        Inc Loop
        SendMIDIOut MIDICommand, Note, MIDIVelocity
        SendMIDIOut MIDICommand, Note, 0, Delay
      endif
    
    @End
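    (If you want to sanity-check the note math without loading the script, the same lookup can be redone in plain Python, with the arrays copied from @OnLoad above; like Mozaic arrays, Python lists are 0-indexed:)

```python
# The script's @OnMidiInput note lookup, redone in plain Python so the
# arithmetic can be checked outside Mozaic.

Sequence = [1, 5, 8, 7, 1, 5, 8, 7, 5, 4, 3, 2, 3, 5, 7, 3]
Scale = [1, 3, 4, 6, 8, 10, 11, 13, 15, 16, 18, 20, 22, 23, 25]
Octave = 5
Root = 0

def note_for_step(loop):
    """MIDI note emitted on the given tap, with the loop counter wrapping."""
    seq_note = Sequence[loop % len(Sequence)]
    return Scale[seq_note - 1] + Octave * 12 - 1 + Root

# First four taps start on MIDI note 60
print([note_for_step(i) for i in range(4)])  # [60, 67, 72, 70]
```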
    
  • It’s working well. I like the global option.

  • @hypnopad said:
    It’s working well. I like the global option.

    Good news. I'm excited to imagine what you will produce with it and what features you will request to improve it. I'm going to make a Chord Object prototype that can be plumbed after the output of a PadObject to have one voice generate chords in the same scale.

    Can anyone think of additional functional objects: probability gates, echoes, arps, an N-note looper?
    Or better yet... write one and share it here with the code. Smaller is best.

    I think I'll use GLOBALS so there's one instance with the knobs and the others never need to be opened in the DAW. There are 99 GLOBALS and each can hold a 1024-element array, so a pattern could be 1024 notes long, and knobs can choose the start point and the length of the melodic sequence. Saving a project in AUM saves all the various knob settings chosen in a practice session to make a performance setup.
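    (The start-point/length knob idea, sketched in plain Python with illustrative names, not Mozaic code:)

```python
# A long pattern stored once, with a start point and a length selecting
# which slice of it loops.

def window_step(pattern, start, length, step):
    """Note for the given step, looping over pattern[start : start+length]."""
    return pattern[start + (step % length)]

pattern = list(range(64))  # stand-in for a long stored melody
# start at slot 8, loop 4 notes: 8, 9, 10, 11, 8, 9, ...
print([window_step(pattern, 8, 4, s) for s in range(6)])  # [8, 9, 10, 11, 8, 9]
```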

  • @McD - Is MIDI programming of Mozaic the same as used in other MIDI apps? I'm working on a project but I don't understand much about MIDI programming.

  • @Samflash3 said:
    @McD - Is MIDI programming of Mozaic the same used in other MIDI apps? I'm working on a project but I dont understand much about MIDI programming.

    MIDI communication (events, data formats) is all standard.

    Programming MIDI is like programming any media type (graphics, images, sounds). It helps to know about music to create musical results. Compared to the learning curve for the other media types, MIDI is a great place to start practicing programming.

    Converting any Mozaic idea into Swift/Objective-C should be a good exercise to figure things out.

    But starting by solving design problems with Mozaic is also a good way to prototype ideas. All the Rozeta FXes could be prototyped in Mozaic, for example, but converting to another language lets you add a custom user interface like @Brambos does, and that's usually important for most iOS users. The Mozaic limitations just make them crazy; they pick apps apart on the basis of usability. For me it's what it can do to make sounds, and I'll use any user interface if the results are interesting. Mononoke broke my brain at first, but now I love it because the results are unique, breaking the keyboard metaphor for notes.
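    (A tiny example of how standard the format is: a MIDI status byte decodes the same way in any app or language. A plain-Python sketch; the 144 here is the same Note On command the PO1 script checks for:)

```python
# A MIDI status byte packs the command in the high nibble and the channel
# in the low nibble; 0x90 (144) is Note On on channel 1.

def decode_status(status):
    command = status & 0xF0        # e.g. 0x90 = Note On, 0x80 = Note Off
    channel = (status & 0x0F) + 1  # channels are usually shown as 1-16
    return command, channel

print(decode_status(144))   # (144, 1): Note On, channel 1
print(decode_status(0x91))  # (144, 2): Note On, channel 2
```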

    @McD said:

    @Samflash3 said:
    @McD - Is MIDI programming of Mozaic ...

    MIDI communication (events, data formats) are all standard. Programming MIDI is like programming any media type ... Converting any Mozaic idea into Swift/Object C should be a good exercise to figure things out.

    You should have written a basic application by now. If not, you're overthinking the prep stage, IMHO. Programmers learn new languages all the time just to get another way of thinking about approaching a task, especially the hundreds of scripting languages that embody different development philosophies (Perl, Python, Ruby, Rust, Scala, etc.). I put Perl in there just to see if any real programmers are reading this shite advice. I made some of my most effective applications using Perl because there are libraries of code for any task you can imagine... mine was text processing for an old business app, converting Excel spreadsheet data into data entry. A sales quote could be converted to sales orders and shipment documents automatically after an Excel quote was emailed back with ordering details. I automated a 6-person back office so they could just make the quotes, and all the other data entry was automated. The IT manager stopped me cold because "human entry" adds quality. I quit. Was the data on the quote wrong and in need of fixing? It was a power play. What thread is this?

  • @McD said:

    @Samflash3 said:
    @McD - Is MIDI programming of Mozaic ...

    Converting any Mozaic idea into Swift/Object C should be a good exercise to figure things out.

    But starting by solving design problems with Mozaic is also a good way to prototype ideas.

    Awesome. Thanks for the response. Here's where I'm kinda stuck... I need it to act as an AUv3 instrument, but it seems that only StreamByter can do that. Have you used StreamByter as well? If so, is it different from Mozaic? It seems to follow the same syntax, but I'm wondering if I should start with your Mozaic tutorials, then go to StreamByter.

    @McD said:
    You should have written a basic application by now. If not, you're overthinking the prep stage, IMHO. Programmers learn new languages all the time just to get another way of thinking about approaching a task, especially the hundreds of scripting languages that embody different development philosophies (Perl, Python, Ruby, Rust, Scala, etc.).

    Sigh...ok.
    (Actually, you're right)

    @Samflash3 said:

    Awesome. Thanks for the response. Here's where I'm kinda stuck... I need it to act as an AUv3 instrument but it seems that only Streambyter can do that. Have you used Streambyter as well? If so, is it different from Mozaic? ...

    Hmm... you're making an assumption about those "Instrument, MIDI FX, FX" AUv3 labels and thinking StreamByter is somehow more useful. This is wrong. StreamByter is very powerful, but it assumes a level of technical skill that only programmers usually have.

    A Mozaic script can generate notes (which is what an instrument does), so don't let its slot over on the side fool you. Neither StreamByter nor Mozaic includes any sound engine. You just plumb its MIDI into a synth.

    To start, use Mozaic... there are some corner cases where StreamByter is more useful, like having a Mac version, which lets me code scripts on a Mac using Logic Pro synths and move the finished scripts over to iOS. The Mac is a developer's dream tool for productive text entry and manipulation. Coding on iOS is like running a marathon with your feet tied together: touch slogging.

    But for the programming command tools, StreamByter gets the "sack race" analogy compared to Mozaic's 100-yard dash.

    Coding the Mozaic text on a Mac and AirDropping it over to iOS is the best of both. But this assumes you can write a lot of Mozaic without needing to test anything, like @_Ki can.

    So get unstuck: start this thread at the top and type in the simple scripts... if you have the mind of a burgeoning developer, a fire will light. If not, keep asking questions until someone finds you the right kindling. Programming is a brain process that should keep you up all night at times, up typing and debugging code.

  • @Samflash3 said:
    Is MIDI programming of Mozaic the same used in other MIDI apps? I'm working on a project but I dont understand much about MIDI programming.

    I thought I'd chime in here to say that, as McD said, MIDI is MIDI no matter where it's used, but Mozaic is a fantastic abstraction of the underlying code needed to produce and manipulate MIDI in an iOS app.

    There's a huge amount of plumbing that has to be carefully constructed to do anything with MIDI in an app built from scratch. Mozaic takes care of all that plumbing and boils down the essential functions into much briefer and more understandable commands.

    That said, Mozaic can be hugely helpful for understanding midi itself so that when you do approach a full-blown programming effort, that part of the learning curve is out of the way. Trying to learn about midi basics while diving right into developing an app is like trying to understand how tomatoes and paprika taste when you're eating a stew. Learning is easier if broken down to individual concepts without having to understand how to make it all work together at first.

    I highly recommend just giving the Mozaic manual a few reads. It has one of the best explanations of midi basics there is right in the first few chapters. When it starts to make sense, then trying some things out takes it to the next level. It's a great way to get one part of the basics you'd need to develop a midi app, without being buried in the giant learning curve that's needed to make a full app.

  • @wim said:
    I highly recommend just giving the Mozaic manual a few reads.

    1 answer: RTFM.

    This thread is the anti-RTFM project. Type in code that works without knowing anything. The essential contest is finding how to get text into the editor, getting it loaded, and running it with results.

    Real programmers think this method wastes a lot of time when you could just read the manual and become educated. I think finding and changing working code is a better way to start from zero.

    I learned C using K&R's classic "The C Programming Language". C is a bitch of a language to master, but this book gets you out of the house and down the path towards Mordor to destroy the ring... Frodo. Or Bilbo, if you want to get the ring. Would the ring be good in a time of social distancing? Sure. Get in, get out without detection, and pick up a few items.

    Anyway... real programmers do show up here, but they rarely get excited about writing dead simple code. Sad. Sometimes dead simple tools are pretty damn useful when glued together using "pipes".

    Choose wisely, young Hobbit.

  • @McD said:

    @wim said:
    I highly recommend just giving the Mozaic manual a few reads.

    1 answer: RTFM.

    This thread is the anti-RTFM project. Type in code that works without knowing anything. The essential contest is finding how to get text into the editor, getting it loaded, and running it with results.

    Real programmers think this method wastes a lot of time when you could just read the manual and become educated. I think finding and changing working code is a better way to start from zero.

    I share similar thoughts. I read the manual (the Apple guide) for iOS Shortcuts but didn't understand a thing. I knew what variables were, but I still didn't know how the app worked overall. When I watched a few videos, then edited someone's code, then built something using other code, it made sense.

    Now I can sorta write a full iOS Shortcut.

    That said, I plan on looking at both the manual and the steps listed above. I'm gonna (for now) continue scripting in Mozaic and Pythonista until I get JUCE-competent.

    @McD said:

    @wim said:
    I highly recommend just giving the Mozaic manual a few reads.

    1 answer: RTFM.

    This thread is the anti-RTFM project. Type in code that works without knowing anything. ... I think finding and changing working code is a better way to start from zero.

    Pages 21-30 do take the simple-code instructional style, but I tend to prefer complete blocks of code that can be pasted into the editor, loaded, and run, and that do something you can hear or see. So putting in more complete examples would be nice. @Brambos is a pro, so he hates wasting extra typing wherever possible. It's a trait of the pros. Writers, by contrast, like to cover the same ground along multiple paths to get it locked into the reader's brain.

    NOTE: as you try to understand these simple apps you'll be desperate for a good reference to the language... so it's only a matter of time: RTFM. It's a good manual, but it's got very little working code in it. A companion edition with simple working code examples would be a nice contribution for anyone seeking a book-writing project. You should write it without knowing too much, so you can catch all the pitfalls that hang up others on the same journey.

    This thread is a nightmare for the reader to follow, and probably why no one has survived from start to here. Even me, really. But someday I'll read the thread and convert it into another form... a wiki would be nice. Especially a shared wiki with a lot of input and not so much re-editing of someone else's work. I like wikis that are a maze of content linked into a knowledge "body of work".

  • edited April 2020

    @McD said:
    Can anyone think of additional functional objects: probability gates, echos, arp, N-note looper?

    One thing I like to design in Ableton Live with Max for Live devices and layers of racks is a velocity-triggered MIDI delay/transposer. Have a Mozaic script (or scripts) that is part of a group of others (say the pad object) and is only triggered by high velocities. In this script, have a MIDI delay set with tempo-based figures (a dotted 1/8 note, for example) and a transpose feature.
    Put a couple of these downstream (all with different delay and transposition settings) and you can have a blast triggering cool arpeggios on the fly just with a hard hit on your pad!
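    (Roughly, in plain Python — illustrative names only, not Mozaic or Max for Live code:)

```python
# Velocity-triggered delay/transposer: only hits above a velocity threshold
# spawn a delayed, transposed echo. Timing is in beats (dotted 1/8 = 0.75).

DOTTED_EIGHTH = 0.75  # in beats

def echo_for_hit(note, velocity, time, threshold=100,
                 delay=DOTTED_EIGHTH, transpose=7):
    """Return a (note, velocity, time) echo for hard hits, else None."""
    if velocity <= threshold:
        return None
    return (note + transpose, velocity, time + delay)

print(echo_for_hit(36, 120, 0.0))  # (43, 120, 0.75): hard hit echoes a 5th up
print(echo_for_hit(36, 80, 0.0))   # None: soft hit, no echo
```

    Chaining a couple of these with different delay and transpose settings is what builds the arpeggio effect described above.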

  • I have two questions about using Mozaic in AUM. After I tweak settings in the code and upload it, will that be saved within AUM when I save the project, or do I need to save each tweaked instance of Mozaic separately before I save the project?
    When you create a MIDI track with multiple Mozaics, does the signal flow go from top to bottom? (That is, assuming they are routed to the same track.) Parallel or sequential, I guess, is what I'm asking.

  • wim
    edited April 2020

    @McD said:
    This thread is the anti-RTFM project. Type in code that works without knowing anything. The essential contest is finding how to get text into the editor, getting it loaded, and running it with results.

    Good to know. I'm in the wrong place then. Buh Bye, and best of luck.

  • @wim said:

    @McD said:
    This thread is the anti-RTFM project. Type in code that works without knowing anything. The essential contest is finding how to get text into the editor, getting it loaded, and running it with results.

    Good to know. I'm in the wrong place then. Buh Bye, and best of luck.

    I’m sorry you took it that way. You can see that there’s a very small audience. I live for someone to try the basic script examples and ask questions. Motivation is essential... and the student brings that.

  • I didn’t take anything negatively. I just wouldn’t be pushing a compatible approach. There’s no one size fits all approach to learning, so you approaching it differently in this thread is a good thing. 👍🏼

  • @shinyisshiny expressed a request for an app that fits with my thinking of linking objects together.

    @shinyisshiny asks:

    I've searched but haven't found exactly what I'm looking for: an app where you generate chords, play those chords in a rhythmic pattern within the app, and then send that MIDI out to different synths.

    It's inspiring to know there would be users if we built something that does this.

    I have started work on the Chords Object, which, combined with the PadObject, would take it to the "generate chords" level. A Rhythm Object would be needed that works with the specific chords coming in, but it might make sense to drive the Rhythm Object from the PadObject and then put the chords at the end.

    But that's the cool thing with objects... order is up to you.

    So... back to the chord objects.

  • Hey @McD,
    Question about MIDI. Can you actually see the MIDI data coming in from a timeline using Mozaic or streambyter? I’m working on a Garageband project, but I gotta prototype it.

    Decided to put off installing C++ for now, and work on learning Python, MIDI scripting, and JSON to make a functioning prototype.

  • @Samflash3 said:
    Hey @McD,
    Question about MIDI. Can you actually see the MIDI data coming in from a timeline using Mozaic or streambyter? I’m working on a Garageband project, but I gotta prototype it.

    Decided to put off installing C++ for now, and work on learning Python, MIDI scripting, and JSON to make a functioning prototype.

    There are some existing tools that work well for seeing all incoming MIDI data:

    MIDIwrench - free - not AUv3, but good for Core MIDI and hardware device monitoring

    MIDISpy - also free (I didn't think it was... works great for debugging Mozaic AUv3 scripts)

    • Standalone and AUv3 support
    • Basic filters for Note, Control Change, Clock and SysEx messages

    Connectivity:

    • Inter-App MIDI
    • USB MIDI
    • Bluetooth MIDI
    • Network MIDI
    • AUv3

    MIDIfire includes a MIDI monitoring function and a whole lot more for doing MIDI outside of DAWs. Also runs on a Mac. Not sure about Windows...

  • hes
    edited May 2020

    @McD said:

    @Samflash3 said:
    Hey @McD,
    Question about MIDI. Can you actually see the MIDI data coming in from a timeline using Mozaic or streambyter? I’m working on a Garageband project, but I gotta prototype it.

    There are some existing tools that work well for seeing all incoming MIDI data:

    I'm curious, within AUM, couldn't you just set up an extra channel with a Mozaic instance to do the same thing? Just hook up whatever inputs you want to view (and use no outputs) and have this in the code:

    @OnMidiInput
    
      log MIDIByte1, { }, MIDIByte2, { }, MIDIByte3
    
    @END
    

    Very easy to add filtering or change the output somewhat if you want. Maybe a more full-blown Mozaic MIDI monitor would be like this:

    // Mozaic Midi Monitor, just save it and load it in 
    // whenever you want to monitor what's happening in your project
    
    @OnMIDINoteOn
    
      log {NoteOn:  }, MIDIByte1, { }, (NoteName MIDIByte2, 1), { }, MIDIByte3
    
    @END
    
    @OnMIDINoteOff
    
      log {NoteOff: }, MIDIByte1, { }, (NoteName MIDIByte2, 1), { }, MIDIByte3
    
    @END
    
    @OnMidiCC
    
        log {CC: }, MIDIByte1, { }, MIDIByte2, { }, MIDIByte3
    
    @END
    
    @OnSysex
     ReceiveSysex data
     length = SysexSize
     Log {SYSEX:  The received Sysex data is }, length, { bytes long.}
    // if you want, uncomment next line
    //  log {The Sysex data is }, data
    @End
    
    // WHERE DO MIDI CLOCK COMMANDS GET HANDLED?
    
    // SOME HELPFUL EVENTS YOU COULD LOG:
    
    //@OnHostStart
    //@OnHostStop
    //@OnNewBar
    //@OnNewBeat
    //@OnMetroPulse
    //@OnTimer
    

    I'm not sure if this is just a "poor man's" midi monitor or not. Seems like it might be preferable to some of the dedicated tools; in many cases I think I could probably modify and/or filter with it more quickly than I could using the interface in a graphic midi monitor. Or maybe I'm missing something obvious and it can't easily do something that the dedicated tools do.

    @hes said:

    I'm curious, within AUM, couldn't you just set up an extra channel with a Mozaic instance to do the same thing? ... I'm not sure if this is just a "poor man's" midi monitor or not. Seems like it might be preferable to some of the dedicated tools; in many cases I think I could probably modify and/or filter with it more quickly than I could using the interface in a graphic midi monitor.

    It's certainly an instructive sample of code that someone learning Mozaic can benefit from by:

    1. re-creating it from scratch and adding extra features
    2. re-typing it into the Mozaic (or any other text) editor
    3. running it in any form in Mozaic and sending it the alphabet song, for example.

    For extra credit they could Log those other "On" events at the end with relevant fortune cookie messages. Then you could use it for fortune telling or reading... in what way are they really different? Probably depends on the ethnicity of the teller/reader. "It is written." "See, I told you." One seems to claim authorship while the other is just reading out loud.
