
How can devs make use of the 3rd cpu core (Air 2) ?

edited November 2014 in General App Discussion

Title says it. I'm just wondering if Apple has to "enable" it or if the devs need to update their apps for it. As it is, I get roughly a 30% CPU boost (coming from a Mini Retina) in every app where I can watch a CPU meter (Auria, Cubasis, etc.). That is in line with the benchmarks for two-core performance of the Air 2 compared to previous models. But what about the 3rd core? On Mac/PC the devs are (always?) responsible for it, but on iOS? I want the power I paid for :)

Comments

  • edited November 2014

    Don't worry, it is being used. There is always a lot of multi-tasking going on, even within audio apps, from disk I/O, waveform calculations, pre-calculated audio processing, the user interface itself, and the main audio pipeline. Given that and all the potential background processing from other apps and the system services, you can rest assured those CPUs are being used for something.

    So even though the critical audio processing path is rarely multi-core, any relief the system can provide to our realtime audio callbacks is a good thing.
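That split can be sketched with a lock-free single-producer/single-consumer ring buffer (a hypothetical illustration, not code from any of the apps mentioned): a worker thread on whatever spare core the system gives it pre-renders or loads samples, and the realtime callback only copies them out, so it never blocks on disk I/O or allocation.

```c
/* Hypothetical sketch: a lock-free single-producer/single-consumer
   ring buffer. A worker thread on any spare core fills it; the
   realtime audio callback only reads from it. */
#include <stdatomic.h>
#include <stddef.h>

#define RB_CAPACITY 1024  /* must be a power of two */

typedef struct {
    float buf[RB_CAPACITY];
    _Atomic size_t head;  /* write index, owned by the producer */
    _Atomic size_t tail;  /* read index, owned by the consumer  */
} RingBuffer;

/* Producer side: called from a worker thread. Returns frames written. */
size_t rb_write(RingBuffer *rb, const float *src, size_t n) {
    size_t head = atomic_load_explicit(&rb->head, memory_order_relaxed);
    size_t tail = atomic_load_explicit(&rb->tail, memory_order_acquire);
    size_t space = RB_CAPACITY - (head - tail);
    if (n > space) n = space;
    for (size_t i = 0; i < n; i++)
        rb->buf[(head + i) & (RB_CAPACITY - 1)] = src[i];
    atomic_store_explicit(&rb->head, head + n, memory_order_release);
    return n;
}

/* Consumer side: safe inside a realtime audio callback, since it takes
   no locks, allocates nothing, and makes no syscalls. Returns frames read. */
size_t rb_read(RingBuffer *rb, float *dst, size_t n) {
    size_t tail = atomic_load_explicit(&rb->tail, memory_order_relaxed);
    size_t head = atomic_load_explicit(&rb->head, memory_order_acquire);
    size_t avail = head - tail;
    if (n > avail) n = avail;
    for (size_t i = 0; i < n; i++)
        dst[i] = rb->buf[(tail + i) & (RB_CAPACITY - 1)];
    atomic_store_explicit(&rb->tail, tail + n, memory_order_release);
    return n;
}
```

Because only one side writes each index, no mutex is needed, and the kernel is free to run the producer on whichever core is idle.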

  • Does that mean devs don't need to update, since iOS does the allocation of CPU load?

  • I would say yes; it's the system's job to allocate CPU time across the cores.
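That's right as far as scheduling goes: apps just create threads, and the kernel places runnable threads on whatever cores are free. A minimal POSIX-threads sketch of the idea (the worker count and the busywork are invented for illustration); nothing in it targets a particular core, so the same code spreads over however many cores the device has.

```c
/* Illustrative only: the app creates threads; the kernel decides which
   core each one runs on. No per-core API is involved. */
#include <pthread.h>

#define NUM_WORKERS 3   /* e.g. one per spare core; purely illustrative */

static void *worker(void *arg) {
    long id = (long)arg;
    long sum = 0;
    /* Some independent busywork; each worker can run on its own core. */
    for (long i = 0; i < 1000000; i++)
        sum += i % (id + 2);
    return (void *)sum;
}

/* Spawn the workers and add up their results. */
long run_workers(void) {
    pthread_t threads[NUM_WORKERS];
    long total = 0;
    for (long i = 0; i < NUM_WORKERS; i++)
        pthread_create(&threads[i], NULL, worker, (void *)i);
    for (long i = 0; i < NUM_WORKERS; i++) {
        void *result;
        pthread_join(threads[i], &result);
        total += (long)result;
    }
    return total;
}
```

An app written this way picks up extra cores for free; what it cannot do is make a single serial DSP chain faster, which is the limitation discussed above.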

  • edited November 2014

    @sonosaurus said:

    Don't worry, it is being used. There is always a lot of multi-tasking going on, even within audio apps, from disk I/O, waveform calculations, pre-calculated audio processing, the user interface itself, and the main audio pipeline. Given that and all the potential background processing from other apps and the system services, you can rest assured those CPUs are being used for something.

    So even though the critical audio processing path is rarely multi-core, any relief the system can provide to our realtime audio callbacks is a good thing.

    Thanks for your answer, although I'm not really satisfied with it ;) Auria is a really good stress test/performance meter, and of course I force quit ALL other apps in the background (plus a soft reset), even switching Wi-Fi off before starting serious work or even testing things. So there must be way more power available. I just can't believe that the system now eats up a whole core (or even more) when all the other devices (the newer ones like the iPhone 5s, Mini Retina, Air 1 and the new phones) handle the same tasks with no real noticeable difference. Yes, it is a little snappier than my Mini Retina in daily usage, but the only place I notice a big difference is RAM. That is really a jump I'm glad to have now.

  • Don't forget about the 8-core GPU. GPUs can also be used for audio processing, according to this series:

    https://itunes.apple.com/us/itunes-u/programming-massively-parallel/id384233322?mt=10

  • GPUs aren't OK for audio because of latency

  • edited November 2014

    @Goozoon said:

    GPUs aren't OK for audio because of latency

    That's right, but maybe for rendering? That should work.

    (processing: mastering/normalize/trim....)
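Right: offline jobs like those have no latency constraint, and even staying on the CPU they split across cores trivially, because every chunk of the file is independent. A minimal sketch of the simplest such job, peak normalization (the function name and target level are made up for illustration):

```c
/* Hypothetical sketch: offline peak normalization. No realtime
   constraint, so this kind of work can be chunked and farmed out to
   every available core; shown single-threaded for brevity. */
#include <math.h>
#include <stddef.h>

/* Scale the buffer so its loudest sample hits `target` (e.g. 1.0f). */
void normalize(float *samples, size_t n, float target) {
    float peak = 0.0f;
    for (size_t i = 0; i < n; i++) {        /* pass 1: find the peak  */
        float a = fabsf(samples[i]);
        if (a > peak) peak = a;
    }
    if (peak == 0.0f) return;               /* silence: nothing to do */
    float gain = target / peak;
    for (size_t i = 0; i < n; i++)          /* pass 2: apply the gain */
        samples[i] *= gain;
}
```

Each pass is trivially parallel: scan per-chunk peaks on separate threads, combine them, then apply the gain per chunk the same way.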
