Video sequencer

There are many tools to create visuals from music, but are there tools that create music from visuals?

What I'd like to see:

An app or AU where you can load a video. You can then place markers in the video window; these are trigger points that fire an action. Every pixel on the video screen carries data, such as its RGB value.
I think you can do a lot with that data, because each pixel gives you a set of three values. For example, you could convert those values into MIDI data.
While a clip is playing, every video pixel is constantly changing value, which would result in a constant MIDI data stream at every trigger point. Solution: set a threshold on the data value.
Also nice: you can attach an LFO to each trigger point's x/y value, so your trigger points move around the screen.

This would be a cool MIDI-sequencer, and very experimental...
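The idea above can be sketched in a few lines: a trigger point samples the pixel under it each frame, converts the RGB value to MIDI note/velocity/CC numbers, and applies the suggested threshold so a constantly changing clip doesn't flood the MIDI stream. Everything here (the `TriggerPoint` name, the frame layout, the channel-to-MIDI mapping) is an assumption, just one way it could work, written in Python for illustration:

```python
# Hypothetical sketch of the trigger-point idea: sample the RGB value
# under each trigger point once per frame, convert it to MIDI, and only
# emit a message when a channel has moved past a threshold.

def rgb_to_midi(r, g, b):
    """Map an RGB pixel (0-255 per channel) to MIDI note, velocity and CC."""
    note = r * 127 // 255       # red channel -> pitch
    velocity = g * 127 // 255   # green channel -> velocity
    cc = b * 127 // 255         # blue channel -> a controller value
    return note, velocity, cc

class TriggerPoint:
    def __init__(self, x, y, threshold=8):
        self.x, self.y = x, y
        self.threshold = threshold  # minimum channel change before firing
        self.last = None            # last (r, g, b) that produced a message

    def sample(self, frame):
        """frame is a 2D list of (r, g, b) tuples; returns MIDI data or None."""
        pixel = frame[self.y][self.x]
        if self.last is not None:
            # Skip tiny frame-to-frame flicker: only fire when some channel
            # moved by at least `threshold` since the last emitted message.
            if max(abs(a - b) for a, b in zip(pixel, self.last)) < self.threshold:
                return None
        self.last = pixel
        return rgb_to_midi(*pixel)
```

A usage sketch: the first sample of a frame always fires; a second sample of an unchanged (or barely changed) pixel returns `None`, which is the threshold doing its job.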

Comments

  • Neat idea. I think for it to be musical (ish, anyway), the app would have to have a very nice/flexible/intelligent/etc quantizer of sorts. A single frame of 4k video has over 8 million pixels! That would choke any MIDI system before it even started.

  • Another type of sequencer:
    A square picture with a black/white pattern.
    It could be any pattern (abstract, hand-drawn, or a photo posterised to black/white).
    Now you place trigger points somewhere on the pattern.
    Let the trigger points move with LFOs.
    Each trigger point can have a note value.
    As soon as a trigger point "hits" the white --> Note On.
    When a trigger point "leaves" the white --> Note Off.

    This would be an advanced version of Rosetta Collider :)
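A minimal sketch of this pattern sequencer, assuming the LFO is a circular orbit around a centre point (the names, the orbit shape, and the wrap-around at the pattern edges are all my own assumptions):

```python
import math

# Sketch of the black/white pattern sequencer described above: trigger
# points orbit over a binary pattern via an LFO; entering a white cell
# sends Note On, leaving it sends Note Off.

WHITE, BLACK = 1, 0

class PatternTrigger:
    def __init__(self, note, cx, cy, radius, rate):
        self.note = note           # MIDI note this point plays
        self.cx, self.cy = cx, cy  # centre of the LFO orbit
        self.radius = radius       # orbit radius in cells
        self.rate = rate           # revolutions per second
        self.inside = False        # currently over a white cell?

    def step(self, pattern, t):
        """Advance to time t (seconds); return 'on', 'off' or None."""
        angle = 2 * math.pi * self.rate * t
        x = int(self.cx + self.radius * math.cos(angle))
        y = int(self.cy + self.radius * math.sin(angle))
        h, w = len(pattern), len(pattern[0])
        on_white = pattern[y % h][x % w] == WHITE
        event = None
        if on_white and not self.inside:
            event = "on"           # just hit the white area -> Note On
        elif not on_white and self.inside:
            event = "off"          # just left the white area -> Note Off
        self.inside = on_white
        return event
```

Because only transitions produce events, a point sitting inside a white region stays silent until it crosses an edge, which is what keeps this musical rather than a constant stream.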

  • @syrupcore said:
    Neat idea. I think for it to be musical (ish, anyway), the app would have to have a very nice/flexible/intelligent/etc quantizer of sorts. A single frame of 4k video has over 8 million pixels! That would choke any MIDI system before it even started.

    A large video format could be downsized to a small one, a couple of hundred pixels in width or height. The goal is to set a few trigger points on the video window, so you only sample a few chosen pixels.
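The downsizing step mentioned above could be as simple as block averaging. A rough illustration in pure Python (grayscale values for brevity, and assuming the output size divides the input size exactly):

```python
# Minimal block-average downsampler: each out_w x out_h output pixel is
# the mean of one rectangular block of the input frame, so only a few
# hundred values remain for the trigger points to sample.

def downsample(frame, out_w, out_h):
    """Reduce a 2D list of pixel values to out_w x out_h by block averaging."""
    in_h, in_w = len(frame), len(frame[0])
    bh, bw = in_h // out_h, in_w // out_w  # block size (assumes exact division)
    result = []
    for oy in range(out_h):
        row = []
        for ox in range(out_w):
            block = [frame[oy * bh + y][ox * bw + x]
                     for y in range(bh) for x in range(bw)]
            row.append(sum(block) // len(block))
        result.append(row)
    return result
```

Averaging (rather than just picking every Nth pixel) also smooths out single-pixel noise, which would otherwise retrigger the threshold constantly.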

  • Slight aside, anyone know of any video synthesis on iOS? Is something along these lines even possible?

  • @waynerowand said:
    Slight aside, anyone know of any video synthesis on iOS? Is something along these lines even possible?

    I came across Pixel Nodes a year ago. I think it's not very stable, but it's a fun app to play with.
    https://apps.apple.com/us/app/pixel-nodes/id1313351782

  • @Identor said:

    @waynerowand said:
    Slight aside, anyone know of any video synthesis on iOS? Is something along these lines even possible?

    I came across Pixel Nodes a year ago. I think it's not very stable, but it's a fun app to play with.
    https://apps.apple.com/us/app/pixel-nodes/id1313351782

    Thank you! The feedback loop looks really fun.

  • I'm pretty sure some of these ideas could be tried out/prototyped in Max/MSP, maybe even Pure Data, which is free/open source. I think you can get a fairly long demo of Max as well, but a full licence isn't cheap if you get hooked...

  • Another program for (interactive) visuals is Vuo:
    https://vuo.org

  • There’s sort of the still-frame equivalent: https://warmplace.ru/soft/phonopaper/
    And also https://warmplace.ru/soft/pixivisor/

  • Alexander Zolotov also has an app that generates sound from the live input of your camera.
    It uses the spectral synthesis algorithm of the Virtual ANS engine, another of his apps.
    It's an old app; Doug at The Sound Test Room tested it in 2014. Here's the YouTube video:

    And here's the link to the App Store:

    https://apps.apple.com/app/nature-oscillator/id906588504
