Audiobus: Use your music apps together.
Why does the mastering engineer do the ceiling and not the codec?
Very technical topic, beware!
This is something I've always wondered about since the whole "ceiling" thing came up as music increasingly got distributed via lossy formats, streaming, etc.
Why is the mastering engineer supposed to set the "ceiling" to keep codecs (well, encoders) further down the line from doing stupid stuff, thereby needlessly reducing the output format's dynamic range? (By a tiny amount, I know, but still!)
Shouldn't each individual codec know best how it works internally, and thus do the necessary level reduction itself, always assuming the input is normalized to 0 dBFS?
Or am I again totally incompetent?
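For anyone curious why a ceiling below 0 dBFS is recommended at all: the usual argument is inter-sample ("true") peaks. A signal whose samples all sit at or below 0 dBFS can still have a reconstructed waveform that goes above 0 dBFS between the samples, and lossy encoding/decoding can push those overs into clipping. Here's a minimal numpy sketch (the tone frequency, phase, and rates are arbitrary choices for illustration, not anything a codec actually does) showing a sample-peak-normalized sine whose true peak is about +3 dB over full scale:

```python
import numpy as np

fs = 48000
f = fs / 4                       # tone at a quarter of the sample rate
n = np.arange(4096)
# Phase chosen so the samples straddle the waveform's real peaks:
x = np.sin(2 * np.pi * f * n / fs + np.pi / 4)
x /= np.max(np.abs(x))           # sample-peak normalize to 0 dBFS

# Estimate the true (inter-sample) peak by 4x FFT oversampling:
# zero-pad the spectrum, inverse-transform at 4x the length.
X = np.fft.rfft(x)
Xup = np.zeros(2 * len(x) + 1, dtype=complex)
Xup[:len(X)] = X
xup = np.fft.irfft(Xup) * 4      # rescale for the longer inverse FFT

print(f"sample peak: {np.max(np.abs(x)):.3f}")    # 1.000 -> 0 dBFS
print(f"true peak:   {np.max(np.abs(xup)):.3f}")  # ~1.414 -> about +3 dBTP
```

So even before any encoder touches it, the "0 dBFS" file already overshoots full scale on reconstruction; the ceiling is headroom for that, plus whatever waveform distortion the lossy encoder adds on top.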