Eye Tracking is coming to iOS/iPadOS as an accessibility feature!

Powered by artificial intelligence, Eye Tracking gives users a built-in option for navigating iPad and iPhone with just their eyes. Designed for users with physical disabilities, Eye Tracking uses the front-facing camera to set up and calibrate in seconds, and with on-device machine learning, all data used to set up and control this feature is kept securely on device, and isn’t shared with Apple.

Eye Tracking works across iPadOS and iOS apps, and doesn’t require additional hardware or accessories. With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes.

https://www.apple.com/newsroom/2024/05/apple-announces-new-accessibility-features-including-eye-tracking/

This could be a game changer for controlling virtual gear when the thing you look at is selected for input. Potentially great for one-knob MIDI controllers and similar!
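For app developers, the interesting question is whether Dwell Control drives the same accessibility element tree that Switch Control and VoiceOver already use. If it does, a knob that is exposed as an adjustable accessibility element would be selectable by eye with no extra work. A minimal sketch of that idea, assuming standard UIKit accessibility APIs (the `KnobView` class and the `sendControlChange` MIDI hook are made up for illustration):

```swift
import UIKit

// Hypothetical one-knob control (class name and MIDI hook are illustrative).
// Exposing the view as an adjustable accessibility element is how assistive
// features such as Switch Control interact with custom controls; the
// assumption here is that Eye Tracking's Dwell Control targets the same
// elements.
final class KnobView: UIView {

    /// Normalised knob position, 0.0 ... 1.0.
    var value: Float = 0.0 {
        didSet {
            accessibilityValue = String(format: "%.0f%%", value * 100)
            sendControlChange(value)
        }
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityLabel = "Filter cutoff"
        accessibilityTraits = .adjustable   // element can be incremented/decremented
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    // Called by the system when an assistive feature adjusts the focused element.
    override func accessibilityIncrement() { value = min(1, value + 0.05) }
    override func accessibilityDecrement() { value = max(0, value - 0.05) }

    private func sendControlChange(_ newValue: Float) {
        // Placeholder: map newValue to a MIDI CC and send it from here.
    }
}
```

If that assumption holds, apps that already do their accessibility homework would get eye-based selection for free.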

Comments

  • Huh, will you look at that, amazing.

  • @kirmesteggno said:

    This could be a game changer for controlling virtual gear when the thing you look at is selected for input. Potentially great for one-knob MIDI controllers and similar!

    Instead of rolling the dials, we will roll our eyes!

  • @HolyMoses said:

    Instead of rolling the dials, we will roll our eyes!

    I already do that anyway 🤣

  • edited May 15

    @reasOne said:

    I already do that anyway 🤣

    Haha, yeah, you won't move knobs or faders with your eyes, just select stuff. Vision Pro users seem to like it; if it works, it shouldn't strain the eyes beyond normal screen use.

  • This is no doubt the beginning of Minority Report-style flat screens from Apple.

  • @realdawei said:
    This is no doubt the beginning of Minority Report-style flat screens from Apple.

    Or we’ll all end up being Stephen Hawking.
