VS - Visual Synthesizer by Imaginando Lda


Comments

  • @sinosoidal said:

    @zeropoint said:
    @sinosoidal

    Just a heads up... the same issues with certain materials' x/y position differences (preview window with tools open vs. maximised preview window, and maximised preview vs. pop-out window) exist on desktop too...

    Both materials are centred here. But look at the difference in positioning between the maximised preview window (left) and the pop-out (right).

    @Gravitas said:

    @sinosoidal said:

    @Fingolfinzz said:
    @sinosoidal yeah, I didn’t word that well. On some video apps, their version of rendering is to hit the record button, which records and saves the result. On others I have used, the render is like an export of the file: I load a video, make edits, and then export that file as a whole, with no option to record, if that clears it up any. Basically I’m asking whether we’ll be able to hit a record button to record the performance and then save that as a file.

    The way I see it, we will need to press the button to record, because VS relies on MIDI events that arrive in real time.

    How about a simple midi note on/off or cc message
    to start and stop the recording?

    That's actually a good idea. A MIDI-mappable record button! :blush:

    😁
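For what it's worth, the suggested start/stop control is just a standard MIDI Control Change message. A minimal sketch in Python; the CC number and the value threshold here are assumed examples, not an actual VS mapping:

```python
# The suggested start/stop control is just a standard MIDI Control Change
# message. CC number 64 and the value threshold are assumed examples here,
# not an actual VS mapping.

def cc_message(channel, controller, value):
    """Build a raw 3-byte Control Change message (channel 1-16, data 0-127)."""
    status = 0xB0 | (channel - 1)   # 0xB0 = Control Change status nibble
    return bytes([status, controller, value])

RECORD_CC = 64                      # hypothetical CC a record button maps to

start_recording = cc_message(1, RECORD_CC, 127)  # value >= 64: start
stop_recording = cc_message(1, RECORD_CC, 0)     # value < 64: stop
```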

  • On an Air 2, what can I realistically expect the performance of this to be?

    Happy to have reduced framerate realtime, export has been promised, but it needs to be workable as a draft format...

    Worth the purchase with future in mind ? Or should I go PC version ? I am not likely to upgrade the iPad any time soon.

  • @AndyPlankton said:
    On an Air 2, what can I realistically expect the performance of this to be?

    Happy to have reduced framerate realtime, export has been promised, but it needs to be workable as a draft format...

    Worth the purchase with future in mind ? Or should I go PC version ? I am not likely to upgrade the iPad any time soon.

    I have an Air 2 and it’s holding up surprisingly well. I’ve not tried more than 4 layers at a time, but it has managed OK even with synths and sequencers feeding it in AUM. My iPad has heated up a bit when driving it hard, not quite as much as Gadget heats it up though 😂.

    I managed to lock up the iPad by pushing it one app too far in Drambo. I had three layers, at least 6 modulations, two AUv3 instruments and three or four FX. It died when I added a Bark Filter on the master.

    So, it has definite limitations, but is FAR better than I expected.

  • Ok, so maybe I’m being dumb. But I figured the standalone app would be able to load in an audio file? If so I haven’t been able to figure out how. Does it not have this ability? If not, then how to get audio into the app if not using the AUv3?

  • @wim said:
    Ok, so maybe I’m being dumb. But I figured the standalone app would be able to load in an audio file? If so I haven’t been able to figure out how. Does it not have this ability? If not, then how to get audio into the app if not using the AUv3?

    It “listens” to whatever is playing on the device. Play something in the Music app, for example.

  • @NimboStratus said:

    @wim said:
    Ok, so maybe I’m being dumb. But I figured the standalone app would be able to load in an audio file? If so I haven’t been able to figure out how. Does it not have this ability? If not, then how to get audio into the app if not using the AUv3?

    It “listens” to whatever is playing on the device. Play something in the Music app, for example.

    I’ve tried playing from the Files app, AudioShare, etc. Nothing.
    I don’t have anything in the Music app, and hope to keep it that way.

  • @wim said:

    @AndyPlankton said:
    On an Air 2, what can I realistically expect the performance of this to be?

    Happy to have reduced framerate realtime, export has been promised, but it needs to be workable as a draft format...

    Worth the purchase with future in mind ? Or should I go PC version ? I am not likely to upgrade the iPad any time soon.

    I have an Air 2 and it’s holding up surprisingly well. I’ve not tried more than 4 layers at a time, but it has managed OK even with synths and sequencers feeding it in AUM. My iPad has heated up a bit when driving it hard, not quite as much as Gadget heats it up though 😂.

    I managed to lock up the iPad by pushing it one app too far in Drambo. I had three layers, at least 6 modulations, two AUv3 instruments and three or four FX. It died when I added a Bark Filter on the master.

    So, it has definite limitations, but is FAR better than I expected.

    If you’re concerned about heat, you might want to not risk this one. While it performs surprisingly well on the Air 2, the heat buildup is significant.

  • @NimboStratus said:

    @wim said:
    Ok, so maybe I’m being dumb. But I figured the standalone app would be able to load in an audio file? If so I haven’t been able to figure out how. Does it not have this ability? If not, then how to get audio into the app if not using the AUv3?

    It “listens” to whatever is playing on the device. Play something in the Music app, for example.

    I believe VS is hearing through the microphone! :smile:

    Right now, the only way to pass audio to VS standalone is via microphone, line input or external sound interface input.

  • @wim said:
    Ok, so maybe I’m being dumb. But I figured the standalone app would be able to load in an audio file? If so I haven’t been able to figure out how. Does it not have this ability? If not, then how to get audio into the app if not using the AUv3?

    Internally, the way we make the demos work is like this:

    • VS has a song engine, the same song engine used in LK
    • We have a multi channel midi file for each audio demo
    • We load an audio player with the demo audio, and the song engine with the respective MIDI file, which creates several tracks, each sending its own MIDI on a different channel.

    In the future we could allow users to load their own audio/MIDI file pairs.
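The demo mechanism described above (an audio file paired with a multi-channel MIDI file whose channels each feed their own layer) can be sketched roughly like this. The event format and function are illustrative, not Imaginando's actual engine code:

```python
# Sketch of the demo mechanism described above: a multi-channel MIDI file
# whose channels each feed their own track/layer, alongside one audio file.
# The (time, channel, note) event format is illustrative, not real engine code.
from collections import defaultdict

def split_by_channel(events):
    """Group (time, channel, note) events into per-channel track lists."""
    tracks = defaultdict(list)
    for time, channel, note in events:
        tracks[channel].append((time, note))
    return dict(tracks)

# A tiny stand-in for a multi-channel demo MIDI file:
demo_events = [(0.0, 1, 60), (0.0, 2, 36), (0.5, 1, 64), (1.0, 2, 38)]
tracks = split_by_channel(demo_events)
# tracks[1] and tracks[2] would each drive a separate VS layer while an
# audio player runs the paired demo audio in parallel.
```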

  • wim
    edited June 2021

    @sinosoidal said:

    @wim said:
    Ok, so maybe I’m being dumb. But I figured the standalone app would be able to load in an audio file? If so I haven’t been able to figure out how. Does it not have this ability? If not, then how to get audio into the app if not using the AUv3?

    Internally, the way we make the demos work is like this:

    • VS has a song engine, the same song engine used in LK
    • We have a multi channel midi file for each audio demo
    • We load an audio player with the demo audio, and the song engine with the respective MIDI file, which creates several tracks, each sending its own MIDI on a different channel.

    In the future we could allow users to load their own audio/MIDI file pairs.

    Thanks for those details. That would be very helpful for those of us with lower-powered iPads, and would also make full-screen screen recording possible.

  • @wim said:

    @sinosoidal said:

    In the future we could allow users to load their own audio/MIDI file pairs.

    Thanks for those details. That would be very helpful for those of us with lower-powered iPads, and would also make full-screen screen recording possible.

    Yes, that would be a great update!

  • @wim said:

    @wim said:

    @AndyPlankton said:
    On an Air 2, what can I realistically expect the performance of this to be?

    Happy to have reduced framerate realtime, export has been promised, but it needs to be workable as a draft format...

    Worth the purchase with future in mind ? Or should I go PC version ? I am not likely to upgrade the iPad any time soon.

    I have an Air 2 and it’s holding up surprisingly well. I’ve not tried more than 4 layers at a time, but it has managed OK even with synths and sequencers feeding it in AUM. My iPad has heated up a bit when driving it hard, not quite as much as Gadget heats it up though 😂.

    I managed to lock up the iPad by pushing it one app too far in Drambo. I had three layers, at least 6 modulations, two AUv3 instruments and three or four FX. It died when I added a Bark Filter on the master.

    So, it has definite limitations, but is FAR better than I expected.

    If you’re concerned about heat, you might want to not risk this one. While it performs surprisingly well on the Air 2, the heat buildup is significant.

    Cheers for the info. Yeah heat is a concern, particularly given the age of the device. Not sure I want to risk it.

  • Nothing to add and tbh, I have no idea if I'm ever going to use this but I love that it's out there and I love that I have it for when.

  • Hi @sinosoidal

    The standalone version runs full screen via USB-C to HDMI, but the AUv3 version doesn’t seem to.

  • @auxmux said:

    @Jumpercollins said:
    Another app just updated with midi sync is the animation app Looom.

    Anyone used that yet?

    Here is someone using it with an OP-Z, pre-update.

    Could be cool to design a layer to use in VS.

    That's super cool, need to check that out.

    Just discovered MIDI sync to Looom v1.5 actually works app-to-app on the same device! If you’re using apeMatrix and set MIDI clock to “always”, you get the MIDI icon in Looom and it indeed syncs to the bpm in apeMatrix. Cool 😎 Haven’t got it to work using AUM though.

    This is just synced bpm though. It doesn’t respond to midi notes/cc or sound frequency like VS does.

    VS is MUCH more flexible and has loads more custom control than this one.

    Looom seems more focused on short animation loops. But, if you can create those loops to be synced to a specific bpm, then import your synced Looom animation loops into VS… that could be a nice combo.

  • @AndyPlankton said:

    @wim said:

    @wim said:

    @AndyPlankton said:
    On an Air 2, what can I realistically expect the performance of this to be?

    Happy to have reduced framerate realtime, export has been promised, but it needs to be workable as a draft format...

    Worth the purchase with future in mind ? Or should I go PC version ? I am not likely to upgrade the iPad any time soon.

    I have an Air 2 and it’s holding up surprisingly well. I’ve not tried more than 4 layers at a time, but it has managed OK even with synths and sequencers feeding it in AUM. My iPad has heated up a bit when driving it hard, not quite as much as Gadget heats it up though 😂.

    I managed to lock up the iPad by pushing it one app too far in Drambo. I had three layers, at least 6 modulations, two AUv3 instruments and three or four FX. It died when I added a Bark Filter on the master.

    So, it has definite limitations, but is FAR better than I expected.

    If you’re concerned about heat, you might want to not risk this one. While it performs surprisingly well on the Air 2, the heat buildup is significant.

    Cheers for the info. Yeah heat is a concern, particularly given the age of the device. Not sure I want to risk it.

    I’m not worried here, but do keep an eye on it.

  • @skiphunt said:

    @auxmux said:

    @Jumpercollins said:
    Another app just updated with midi sync is the animation app Looom.

    Anyone used that yet?

    Here is someone using it with an OP-Z, pre-update.

    Could be cool to design a layer to use in VS.

    That's super cool, need to check that out.

    Just discovered MIDI sync to Looom v1.5 actually works app-to-app on the same device! If you’re using apeMatrix and set MIDI clock to “always”, you get the MIDI icon in Looom and it indeed syncs to the bpm in apeMatrix. Cool 😎 Haven’t got it to work using AUM though.

    This is just synced bpm though. It doesn’t respond to midi notes/cc or sound frequency like VS does.

    VS is MUCH more flexible and has loads more custom control than this one.

    Looom seems more focused on short animation loops. But, if you can create those loops to be synced to a specific bpm, then import your synced Looom animation loops into VS… that could be a nice combo.

    Going to give it a go myself. Looks like a neat little animation tool for the combo.

    Looking fwd to see where these 2 apps go.

  • @wim said:

    @AndyPlankton said:

    @wim said:

    @wim said:

    @AndyPlankton said:
    On an Air 2, what can I realistically expect the performance of this to be?

    Happy to have reduced framerate realtime, export has been promised, but it needs to be workable as a draft format...

    Worth the purchase with future in mind ? Or should I go PC version ? I am not likely to upgrade the iPad any time soon.

    I have an Air 2 and it’s holding up surprisingly well. I’ve not tried more than 4 layers at a time, but it has managed OK even with synths and sequencers feeding it in AUM. My iPad has heated up a bit when driving it hard, not quite as much as Gadget heats it up though 😂.

    I managed to lock up the iPad by pushing it one app too far in Drambo. I had three layers, at least 6 modulations, two AUv3 instruments and three or four FX. It died when I added a Bark Filter on the master.

    So, it has definite limitations, but is FAR better than I expected.

    If you’re concerned about heat, you might want to not risk this one. While it performs surprisingly well on the Air 2, the heat buildup is significant.

    Cheers for the info. Yeah heat is a concern, particularly given the age of the device. Not sure I want to risk it.

    I’m not worried here, but do keep an eye on it.

    I have to keep the fans on my laptop cooler running, or else my iPad becomes more like a hot plate 🥵

  • Now an ever more vital part of my set-up

  • Apps for Music Video Creation

    I have several video/audio apps, and this is the first audio-reactive AU one I have. My favorite of these apps is K Machine because it’s so responsive to audio, I can create or modify GLSL scripts, and it can output video to HDMI.
    Shaderific is great for applying GLSL textures to 3D .obj files, which can then be incorporated into video clips. Iyan 3D is also great for animating .obj objects and 3D text. Forger and Sculptura 3D can be used to create .obj files.

    I enjoy Touch ViZ because I can load it up with several banks of short clips, use effects, and switch clips under MIDI control with a VJ-style workflow. Takete allows me to create an audiovisual performance using loops of audio and video arranged in a grid, plus it has a nice HDMI output. Glitch Clip is similar to Touch ViZ.

    Visual Synth is a nice app that lets you jump to different locations in a video based on MIDI note input. Looom is great for creating short rhythmic animations, especially when you use an SVG image as the background layer to mask the additional layers.

    LumaFusion, Green Screen and Vector Q are great for adding additional effects, layers, text, and images to your music videos.

    Future Features for VS

    1. Ability to create custom GLSL scripts, even if they have to be adapted to match the app's built-in parameter controls (you already need to do this with K Machine to get the best results).
    2. HDMI output
    3. Multi-Input where you can control a layer with a specific audio input using Multi-bus Audio Unit Instances in AUM to control the video output of the main VS AU instance. This is similar to using different MIDI channels to control parameters in different layers.
    4. An ability to load small clips in all of the slots and control their start and end points via MIDI (combines some of the functionality of TouchViZ and the Visual Synth apps).
    5. In the standalone app, the ability to load in audio files, video clips, and MIDI files so you can generate the best-quality video output without being dependent upon realtime rendering limitations (some of the capabilities of Takete). You could use an AU instance in a DAW to create the video, audio, and MIDI files used for the final render in the standalone app. It would even be nice to have a way to save project files which point to these audio, MIDI, and video clip media resources, similar to the way LumaFusion does with video, audio, and photo media. It would also be nice to have some sort of timeline/pattern functionality so you could create songs by chaining together different combinations of project presets.
    6. An AU host app that can display the VS AU instance full screen so it can be screen recorded, and that can specify various dimensions for the output, like a square or 16:9 ratio, for further editing and usage in apps like LumaFusion.
  • edited June 2021

    @The_DMT_Experiment said:

    @wim said:

    @AndyPlankton said:

    @wim said:

    @wim said:

    @AndyPlankton said:
    On an Air 2, what can I realistically expect the performance of this to be?

    Happy to have reduced framerate realtime, export has been promised, but it needs to be workable as a draft format...

    Worth the purchase with future in mind ? Or should I go PC version ? I am not likely to upgrade the iPad any time soon.

    I have an Air 2 and it’s holding up surprisingly well. I’ve not tried more than 4 layers at a time, but it has managed OK even with synths and sequencers feeding it in AUM. My iPad has heated up a bit when driving it hard, not quite as much as Gadget heats it up though 😂.

    I managed to lock up the iPad by pushing it one app too far in Drambo. I had three layers, at least 6 modulations, two AUv3 instruments and three or four FX. It died when I added a Bark Filter on the master.

    So, it has definite limitations, but is FAR better than I expected.

    If you’re concerned about heat, you might want to not risk this one. While it performs surprisingly well on the Air 2, the heat buildup is significant.

    Cheers for the info. Yeah heat is a concern, particularly given the age of the device. Not sure I want to risk it.

    I’m not worried here, but do keep an eye on it.

    I have to keep the fans on my laptop cooler running, or else my iPad becomes more like a hot plate 🥵

    I have ZGameVisualizer within FL Studio, so I'm not tempted enough to risk the heat :) I have a £15 iTunes voucher I received as a Father's Day gift which is burning a hole in my pocket LOL

    @The_DMT_Experiment That is a beefy looking cooling system :)

    I think I should maybe look into creating shaders etc so I can get more from the tools I already have. Too busy doing other stuff though, work, life, you know :D

  • @sinosoidal - a small app suggestion ... LFO speed with Sync On is kind of fast for my taste. Settings down to 16 or more, rather than 8, would be nice to have.

  • Hey, just some info for people concerned about older iPads overheating. Things run waaaayyy cooler if not using audio input. I just did an extensive session using only MIDI triggering and envelopes, and the Air 2 stayed cool as a cucumber. In some ways the control is much better that way too.

    Now I’m wishing there were 4 EG’s rather than just 2.

    On the other hand, using FAC Envolver + AU parameter automation would make it possible to avoid audio input as well, and be essentially unlimited. Off to try that out. B)

  • wim
    edited June 2021

    Another killer realization ...
    The background layer isn’t modulatable internally, but the AU parameters for it are. This means you can use external LFOs, envelopes, notes, etc to mess around with even the background layer (if your device can handle it.) B)
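A rough sketch of what such an external modulator does: a unipolar LFO sampled at control rate, with each value destined for one of VS's exposed AU parameters. The `rate_hz` and `depth` names are illustrative, and the host-specific call that actually writes the parameter is omitted:

```python
# Unipolar sine LFO in [0, 1]: the kind of control signal an external
# modulator writes into one of VS's exposed AU parameters. Names are
# illustrative; the host call that sets the parameter is omitted.
import math

def lfo_value(t, rate_hz=0.5, depth=1.0):
    """Value of a unipolar sine LFO at time t (seconds)."""
    return depth * (0.5 + 0.5 * math.sin(2 * math.pi * rate_hz * t))

# Sample at a modest control rate; each value would be written to the
# background layer's AU parameter by the host.
values = [lfo_value(t / 50) for t in range(100)]
```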

  • @wim said:
    Another killer realization ...
    The background layer isn’t modulatable internally, but the AU parameters for it are. This means you can use external LFOs, envelopes, notes, etc to mess around with even the background layer (if your device can handle it.) B)

    I was using the Rosetta LFOs to do this; it definitely shot the CPU up a fair amount though. I’m also on an Air 1, so it doesn’t take much.

  • @Paulinko said:
    Apps for Music Video Creation

    1. Multi-Input where you can control a layer with a specific audio input using Multi-bus Audio Unit Instances in AUM to control the video output of the main VS AU instance. This is similar to using different MIDI channels to control parameters in different layers.

    If I understand correctly I think we are already able to achieve something like this by using several instances of FAC Envolver on separate audio tracks in AUM (for instance) and then sending the desired MIDI to separate layers in VS?
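If that reading is right, the per-track part of the chain is just an envelope follower emitting CC values onto its own channel. A crude sketch; the controller and channel numbers are arbitrary examples, not Envolver's actual defaults:

```python
# Crude envelope-follower sketch: the peak of an audio block mapped to a
# 0-127 MIDI CC value, roughly what a per-track follower emits before its
# MIDI is routed to a separate VS layer. Numbers are illustrative.

def follower_to_cc(samples, controller=1, channel=1):
    """Map the peak of a block of samples (assumed in [-1, 1]) to a CC message."""
    peak = max(abs(s) for s in samples)
    value = min(127, int(peak * 127))
    return bytes([0xB0 | (channel - 1), controller, value])

# One follower per AUM track, each on its own channel, all routed to VS:
loud_block = follower_to_cc([0.0, 0.5, -1.0], channel=1)   # layer 1
quiet_block = follower_to_cc([0.1, -0.05], channel=2)      # layer 2
```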

  • @wim said:
    Hey, just some info for people concerned about older iPads over heating. Things run waaaayyy cooler if not using audio input. I just did an extensive session using only midi triggering and envelopes, and the Air 2 stayed cool as a cucumber. In some ways the control is much better that way too.

    Now I’m wishing there were 4 EG’s rather than just 2.

    On the other hand, using FAC Envolver + AU parameter automation would be able to avoid audio control as well, and be essentially unlimited. Off to try that out. B)

    Interesting.

  • @Eurikon said:
    @sinosoidal
    I’m not sure if GitHub is the place to report bugs, but I posted there.

    I’ll report here as well:
    I can reproduce a crash when using the VS macOS version while using Sidecar on the Mac, with the application dragged to the iPad screen.

    As soon as I click on it with an Apple Pencil it crashes.
    Let me know if you need more information.

    @sinosoidal
    maybe this post has slipped your attention, or maybe you were not able to reproduce it?

    I can reproduce it over and over.
    It is also holding me back from buying the macOS desktop version.

    Please let me know if you need any further information.

  • edited June 2021

    Very nice results with the latest beta version. The breakout display output in the standalone version works amazingly well, and I was super happy to see the visuals at my PC display’s resolution of 2560x1440 @ 144 Hz.

    I think I’m finally coming around in understanding how this whole thing works. That little MIDI channel knob is the crucial element that makes all this magic work.

    I’ve discovered that trying to add a single instance of VS to an existing large project in either AUM or Drambo (I haven’t tried apeMatrix yet!) is wishful thinking, and probably not the best way to go about using VS.

    One will definitely have better results in AUM by dumping (committing) any large project (one with multiple synths, MIDI sequencers, effects, etc.) to audio, then starting a new blank project with file players playing said audio, and placing VS in one of the file players’ effects slots. Then you can “import” your MIDI sequencers from any project and wire those up to VS.

    Another way would be to build a new project around VS, or with VS in mind at least. This way, you’ll be able to keep an eye on your DSP levels and add/remove elements accordingly.

    I’ve been having great results driving VS with Atom 2, btw. Each Atom 2 instance’s MIDI output can be “funneled” onto a specific channel, with all of them wired to VS, and that little MIDI channel knob then gives each “Material” its own MIDI sequence.
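The channel "funneling" above amounts to rewriting the channel nibble of each voice message so it lands on the channel a given Material listens to. A minimal sketch; the function and framing are illustrative, not anything Atom 2 or VS exposes as code:

```python
# Sketch of "funneling" a sequencer's MIDI onto a specific channel:
# rewrite the channel nibble of each channel voice message. Illustrative
# only; neither Atom 2 nor VS exposes this as code.

def retag_channel(msg, channel):
    """Return msg with its channel nibble set to `channel` (1-16).
    System messages (status >= 0xF0) pass through untouched."""
    status = msg[0]
    if status < 0x80 or status >= 0xF0:
        return msg                              # not a channel voice message
    return bytes([(status & 0xF0) | (channel - 1)]) + msg[1:]

note_on = bytes([0x90, 60, 100])                # Note On, channel 1
for_material_3 = retag_channel(note_on, 3)      # same note, now on channel 3
```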

    Just thinking out loud here this morning.. really all I wanted to say was… “this thing is freaking awesome!!” 🤔🤪😍
