Apple scanning images on device and in iCloud

tja
edited August 2021 in Other

https://www.apple.com/child-safety/

Once again, all people are affected by something that simply breaks security and privacy, under the flag of protecting children.
Just because they found a technical way to do it.

It is not the job of a company or service provider to do this.
And it is not even the job of a government to do this, as long as no judge has allowed the police to do it in an individual case!

But instead of individuals, the content of all people will now simply be scanned.

It is time for companies that sell cabinets and other furniture to include some scanning system in their products, to make sure that you don't put child porn into them!

Also, they could send a team to inspect your flat quarterly.
Just to be sure.

The whole planet is going down the drain because of wrong decisions.
I have no idea what to do.


Comments

  • edited August 2021

    Given the multiple layers of user data encryption and privacy protection they built into this, and the terrible things they are trying to prevent by implementing it, I’m totally fine with it.

  • edited August 2021

    From the article:

    CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

    If anyone has a problem with an online storage provider searching for and detecting child porn in photos being uploaded to their cloud service…. I’m sorry but I see no sane way this could be considered a bad thing.

  • @Hmtx said:
    From the article:

    CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

    If anyone has a problem with an online storage provider searching for and detecting child porn in photos being uploaded to their cloud service…. I’m sorry but I see no sane way this could be considered a bad thing.

    The problem is with indiscriminate searching of anything for anything at all.

    What would you think if at every supermarket entrance, you were asked to pull down your underwear so you could be searched for a potential bomb? "But it's just for everyone's safety"...

    The proportion of iCloud photo uploads being child porn (or "CSAM", one more stupid abbreviation to remember) is probably in the same ballpark as bombs in underwear at supermarket entrances.

  • tja
    edited August 2021

    @Tarekith and @Hmtx that's exactly how they argue.

    "We try to do good, and for this we need to look into your pockets and smell your underwear".

    You don't seem to understand what is happening here!

    They break the privacy and security of any and all people!

    And whatever they SAY they are looking for, nobody can prevent them from looking for completely different things too, now or later.

    They look into, watch, scan and judge YOUR most personal data!

    This is a mechanism that can and will be misused: on purpose, by mistake and by accident!
    I can only imagine what can and will go wrong.

    And no, the words "protect children" written on their flag should definitely not allow any government or company to do this.

  • @tja said:
    "We try to do good, and for this we need to look into your pockets and smell your underwear".

    😄 you literally took the words out of my mouth.

  • tja
    edited August 2021

    @SevenSystems said:

    @tja said:
    "We try to do good, and for this we need to look into your pockets and smell your underwear".

    😄 you literally took the words out of my mouth.

    Hehe :smile:

    Right, @SevenSystems
    They do this because it is easily doable, much easier than physically searching every person in a shop, or stopping each and every car at every corner for a search.

    It seems it has become necessary to opt out of most technical gadgets.

    Big Brother is finally here.

    It's pure horror to me.

  • @Tarekith @Hmtx that is exactly how your 4th amendment rights were abolished after 9/11.

    And of course there’s the issue of false positives. “Oh, that was a photo of your daughter at the beach? Sorry for imprisoning you and ruining your life.”

  • tja
    edited August 2021

    It was bad enough that the EU is pushing for and implementing backdoors in chat encryption, also with the argument of protecting children and preventing terrorism.

    But now companies take on the roles of lawmaker, judge and police, all together, not separated as required for governments.

    And given the latest information about how easily Face ID can be hacked (yes, I always suspected this and preferred Touch ID), you need to resort to passwords only and have everything additionally encrypted.

  • @Philandering_Bastard said:
    @Tarekith @Hmtx that is exactly how your 4th amendment rights were abolished after 9/11.

    And of course there’s the issue of false positives. “Oh, that was a photo of your daughter at the beach? Sorry for imprisoning you and ruining your life.”

    And the first step will be that your account gets closed, yes, with all your purchases and your iCloud data.
    Good luck getting access back!

    We are just pawns.

  • Nah, you guys are wrong.

    iCloud Photo is an online service to store images. They have a legal obligation to do what they can to ensure they aren’t storing anything illegal. If you expect that company not to review/ scan what they are storing for you… well you are going to have to pay for a service with a much higher price than what Apple charges.

  • tja
    edited August 2021

    @Hmtx said:
    Nah, you guys are wrong.

    iCloud Photo is an online service to store images. They have a legal obligation to do what they can to ensure they aren’t storing anything illegal. If you expect that company not to review/ scan what they are storing for you… well you are going to have to pay for a service with a much higher price than what Apple charges.

    No, that is not their job and it should even be illegal!

    The situation is different when you are sharing content!
    In this case, they may legally be required to make sure that you are not offering illegal content.

    But you can't argue with people who tend to reply "I have nothing to hide".

  • @tja said:

    @Hmtx said:
    Nah, you guys are wrong.

    iCloud Photo is an online service to store images. They have a legal obligation to do what they can to ensure they aren’t storing anything illegal. If you expect that company not to review/ scan what they are storing for you… well you are going to have to pay for a service with a much higher price than what Apple charges.

    No, that is not their job and it should even be illegal!

    The situation is different when you are sharing content!
    In this case, they may legally be required to make sure that you are not offering illegal content.


    You can share an album in iOS.

  • edited August 2021

    I disagree, @tja. Possession of child porn can and should be prosecuted. It’s not about whether you share it.

    And I am not arguing from the “good people have nothing to hide” trope. I have plenty to hide… I will just never put it on the internet or into the hands of another company.

    “ that is not their job and it should even be illegal!” … hmm, Apple can do whatever they please with their services. If they declare to their customers how they are handling photos stored on their servers, what could be considered illegal here? If you don’t like the service go buy another service.

    [EDIT: and I’m bowing out here, sorry I don’t have the brain space to keep arguing an issue many of us clearly have strong opinions about.]

  • I also don't want to fight about this, @Hmtx

    But while I too don't produce or consume child porn, don't plan terrorist attacks and don't want to build bombs, I still don't want others to check what I am doing!

    Even if a person does such things, no company or government should be allowed to scan the content of any and all users, even if they would find that person this way.

    Most countries have a legal system, and that should definitely not be broken!

    As long as nobody brings a person to court and a judge allows it, nobody should legally be able to look into that person's private data, or into what they do in their bed, their bathroom or elsewhere.

    To me, this seems like such a simple and basic thing to agree on.

  • @tja said:
    But while I too don't produce or consume child porn, don't plan terrorist attacks and don't want to build bombs, I still don't want others to check what I am doing!

    The thing is, IF someone actually wants to produce and distribute child porn, they surely won't be stupid enough to use iCloud!

  • edited August 2021

    The article says "new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos."

    What this suggests to me is that there is probably a big old government database somewhere of commonly traded images that are already out there. These images are probably known to often be in the possession of your typical pedo trader folk. So it sounds like Apple will scan its database for these 'known CSAM images'. It doesn't seem to imply to me that any new picture taken will be looked at by someone at Apple to see if your kid's beachside nipple slip is accidental or not. Seems more like a machine going 'does it match any of these pedo pics making the rounds?' sort of thing. It also sounds like a certain threshold of positives needs to be hit first. But yeah, maybe next they will scan for movies, comic books, nuke plans etc, who knows? Maybe they already do.
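
    Purely as an illustration of the kind of check this comment describes, here is a minimal sketch assuming a plain exact-hash lookup and a made-up threshold. Apple's actual system uses a perceptual hash ("NeuralHash") and cryptographic matching, which this does not reproduce; every name and number below is hypothetical.

    ```python
    import hashlib
    from pathlib import Path

    # Illustrative stand-in for a database of hashes of already-known images;
    # in the real system these come from child-safety organisations.
    KNOWN_IMAGE_HASHES = {hashlib.sha256(b"already-known example image").hexdigest()}

    # Hypothetical number of matches required before anything is flagged for review.
    MATCH_THRESHOLD = 30

    def image_hash(path: Path) -> str:
        """Exact SHA-256 of the file bytes; a real system uses a perceptual hash
        so that resized or re-encoded copies of a known image still match."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def count_matches(photo_dir: Path) -> int:
        """Count how many photos in a folder match the known-image database."""
        return sum(1 for p in photo_dir.glob("*.jpg")
                   if image_hash(p) in KNOWN_IMAGE_HASHES)

    if __name__ == "__main__":
        matches = count_matches(Path("photos"))
        if matches >= MATCH_THRESHOLD:
            print(f"{matches} matches: threshold reached, flagged for human review")
        else:
            print(f"{matches} matches: below threshold, nothing is reported")
    ```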

  • edited August 2021

    @SevenSystems said:

    @tja said:
    But while I too don't produce or consume child porn, don't plan terrorist attacks and don't want to build bombs, I still don't want others to check what I am doing!

    The thing is, IF someone actually wants to produce and distribute child porn, they surely won't be stupid enough to use iCloud!

    Oh, of course there are stupid sick people... (that is part of the problem). I mean, there literally are medically sick people. If the government is trying to dismantle networks of people (which is how these things likely operate), there will be whole strata of connections, and if one of them is wiggy, off their rocker and storing stuff stupidly, then there is a crack to exploit in bringing down the operation.

  • @AudioGus said:
    The article says "new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos."

    What this suggests to me is that there is probably a big old government database somewhere of commonly traded images that are already out there. These images are probably known to often be in the possession of your typical pedo trader folk. So it sounds like Apple will scan its database for these 'known CSAM images'.

    That's exactly what Apple has said they are doing, scanning for known images.

  • @Tarekith said:

    @AudioGus said:
    The article says "new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos."

    What this suggests to me is that there is probably a big old government database somewhere of commonly traded images that are already out there. These images are probably known to often be in the possession of your typical pedo trader folk. So it sounds like Apple will scan its database for these 'known CSAM images'.

    That's exactly what Apple has said they are doing, scanning for known images.

    Yah just thought I would extract that bit for those who don't read articles before jumping in. ;) I'm feeling smug having read the actual article this time.

    I kind of see it as Apple's way of trying to get ahead of having to provide back doors to allow authorities to check for child porn. Once they provide a back door, it’s just a matter of people finding the key.

  • @BiancaNeve said:
    I kind of see it as Apple's way of trying to get ahead of having to provide back doors to allow authorities to check for child porn. Once they provide a back door, it’s just a matter of people finding the key.

    Good point, had not thought of that. Plus the way they describe it sounds like 'dusting for prints'. But at the same time Apple also claim to be selling 'Pro' tablets... stay vigilant folks!

  • https://9to5mac.com/2021/08/05/apple-announces-new-protections-for-child-safety-imessage-safety-icloud-photo-scanning-more/

    "Apple says the feature will come first to the United States but it hopes to expand elsewhere eventually."

    We'll have to wait and see how the EU responds to this at some point.
    But then again we already have more than a few 'nutcases' in the EU Parliament...

  • @Samu said:
    https://9to5mac.com/2021/08/05/apple-announces-new-protections-for-child-safety-imessage-safety-icloud-photo-scanning-more/

    "Apple says the feature will come first to the United States but it hopes to expand elsewhere eventually."

    We'll have to wait and see how the EU responds to this at some point.
    But then again we already have more than a few 'nutcases' in the EU Parliament...

    Good tip! Scan them first, Apple.

  • Yeah I'm all for privacy but at the same time I'd like to see who they manage to flush out.

  • @Hmtx said:
    From the article:

    CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

    If anyone has a problem with an online storage provider searching for and detecting child porn in photos being uploaded to their cloud service…. I’m sorry but I see no sane way this could be considered a bad thing.

    This conversation would be more productive if we ignore the people who hijack it with the “child porn is bad, therefore literally anything done to prevent it is justified” argument.

    You’ll never convince those folks that privacy is important, or that it needs protection in order to keep it. Trying to do so will just frustrate you.

    I’m most surprised that the matching appears to be done on your phone, and “safe” images are tagged as such. Technically it makes sense—the on-device image recognition in iOS 15 beta is incredible. It recognizes animal and plant species pretty much instantly.

    I’m uncomfortable with my photos getting scanned like this, but Apple is probably still the least-bad company for online photos. I’ve read that it’s obligatory for photo storage services in the US to run these scans.

    It also seems like this is a way for the authorities to get access to photos when Apple turns on proper end-to-end encryption for iCloud photo library. By having the device run the scan, and then reporting positives to Apple, the entire library, online too, could be encrypted. It’s a clever workaround, but it’s a chilling precedent.

    I will wait to see if this flies in Europe, but I’m considering ditching iCloud Photos. The problem is, it’s a really great service otherwise.
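
    As a purely illustrative sketch of the "scan on the device, report only positives" idea described in this comment: the device does the matching and the server only ever receives ciphertext plus a match flag. This is not Apple's actual protocol (which uses NeuralHash, private set intersection and threshold secret sharing rather than a plain boolean), and the "encryption" below is a toy placeholder; all names are made up.

    ```python
    import hashlib
    import os
    from dataclasses import dataclass

    # Illustrative stand-in for the on-device copy of the known-image hash list.
    KNOWN_HASHES = {hashlib.sha256(b"already-known example image").hexdigest()}

    @dataclass
    class Upload:
        ciphertext: bytes  # photo encrypted with a key the server never sees
        matched: bool      # the only thing the server learns about the content

    def toy_encrypt(data: bytes, key: bytes) -> bytes:
        """Placeholder XOR 'encryption', purely for illustration; not real crypto."""
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    def prepare_upload(photo: bytes, device_key: bytes) -> Upload:
        """The scan runs on the device; the server only stores ciphertext plus the
        match result, so the photo library itself can stay end-to-end encrypted."""
        digest = hashlib.sha256(photo).hexdigest()
        return Upload(ciphertext=toy_encrypt(photo, device_key),
                      matched=digest in KNOWN_HASHES)

    if __name__ == "__main__":
        device_key = os.urandom(32)
        upload = prepare_upload(b"\xff\xd8 pretend jpeg bytes", device_key)
        print(f"server sees {len(upload.ciphertext)} opaque bytes, matched={upload.matched}")
    ```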

  • edited August 2021

    Catching the worst of the worst among us is a laudable goal; however, I see this as opening the door for countries like China or Russia (or even the US) to demand that Apple monitor for political dissidents or anyone opposed to the regime in power.

    Frankly, if they wanted to solve problems involving child abuse, they should start with the majority of people who work in Hollywood and the entertainment industry and those who wield unchecked political power (remember Epstein, who was connected to nearly everyone in politics?)

  • How about if we just ignore the people who say privacy is everything and to hell with the rights of children instead?

  • @NeuM said:
    Catching the worst of the worst among us is a laudable goal, however I see this as opening the doors for countries like China or Russia (or even the US) to demand Apple to monitor for political dissidents or anyone opposed to the regime in power.

    Agreed, except I do not see this in regard to China or Russia.
    I see it happening in regard to the US and the U.K.
    It's a well-known fact that we have more surveillance cameras than any country in the world except China.
    The only exception to this is the cameras that used to surround the Houses of Parliament; those were removed when it was attacked by a lone person a few years ago, and all the footage was conveniently 'misplaced'.

    Frankly, if they wanted to solve problems involving child abuse, they should start with the majority of people who work in Hollywood and the entertainment industry and those who wield unchecked political power (remember Epstein, who was connected to nearly everyone in politics?)

    Agreed.

    I'm sticking my neck out here.
    I know for a fact that there is abuse within the entertainment industry, and some of it is being enabled by law enforcement members here in the U.K.
    I walked away from the industry because of it.
    That was two decades ago.
    Two years ago I provided my audio engineering services for an ex-police-officer/whistleblower who confirmed that police officers were involved and complicit.

    The moment that politicians and law enforcement agencies also have to conform to Apple's new implementation, it would be worth the hassle.

    Major media outlets here in the U.K. have run a series of articles, going back at least a decade, stating that major politicians, right up to the former prime minister Theresa May, were complicit in child porn, abuse et al.
    Apparently there are trials pending, but so far nothing has been done.

    When Theresa May was P.M., her political party introduced what is called the 'Snoopers' Charter'; here's an article from Computerworld that describes it.

    https://www.computerworld.com/article/3427019/the-snoopers-charter-everything-you-need-to-know-about-the-investigatory-powers-act.html

    There were amendments included in the Charter stating that all official data records concerning politicians from before the Charter could be wiped clean.

    As far as I'm concerned, since the Charter was implemented all of our communication in the U.K. is monitored.

    There are other, more unsavoury things happening data-wise, so Apple permitting law enforcement to search through our data doesn't bother me; I'm more concerned about the U.K. Government.

  • @Gravitas said:

    @NeuM said:
    Catching the worst of the worst among us is a laudable goal, however I see this as opening the doors for countries like China or Russia (or even the US) to demand Apple to monitor for political dissidents or anyone opposed to the regime in power.

    Agreed, except I do not see this in regard to China or Russia.
    I see it happening in regard to the US and the U.K.
    It's a well-known fact that we have more surveillance cameras than any country in the world except China.
    The only exception to this is the cameras that used to surround the Houses of Parliament; those were removed when it was attacked by a lone person a few years ago, and all the footage was conveniently 'misplaced'.

    Frankly, if they wanted to solve problems involving child abuse, they should start with the majority of people who work in Hollywood and the entertainment industry and those who wield unchecked political power (remember Epstein, who was connected to nearly everyone in politics?)

    Agreed.

    I'm sticking my neck out here.
    I know for a fact that there is abuse within the entertainment industry, and some of it is being enabled by law enforcement members here in the U.K.
    I walked away from the industry because of it.
    That was two decades ago.
    Two years ago I provided my audio engineering services for an ex-police-officer/whistleblower who confirmed that police officers were involved and complicit.

    The moment that politicians and law enforcement agencies also have to conform to Apple's new implementation, it would be worth the hassle.

    Major media outlets here in the U.K. have run a series of articles, going back at least a decade, stating that major politicians, right up to the former prime minister Theresa May, were complicit in child porn, abuse et al.
    Apparently there are trials pending, but so far nothing has been done.

    When Theresa May was P.M., her political party introduced what is called the 'Snoopers' Charter'; here's an article from Computerworld that describes it.

    https://www.computerworld.com/article/3427019/the-snoopers-charter-everything-you-need-to-know-about-the-investigatory-powers-act.html

    There were amendments included in the Charter stating that all official data records concerning politicians from before the Charter could be wiped clean.

    As far as I'm concerned, since the Charter was implemented all of our communication in the U.K. is monitored.

    There are other, more unsavoury things happening data-wise, so Apple permitting law enforcement to search through our data doesn't bother me; I'm more concerned about the U.K. Government.

    100% in agreement with you. Our government is sinister and aches to become as totalitarian as it can get away with. The past 18 months have given them carte blanche to do as they like.

  • @SevenSystems said:

    @Hmtx said:
    From the article:

    CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

    If anyone has a problem with an online storage provider searching for and detecting child porn in photos being uploaded to their cloud service…. I’m sorry but I see no sane way this could be considered a bad thing.

    The problem is with indiscriminate searching of anything for anything at all.

    What would you think if at every supermarket entrance, you were asked to pull down your underwear so you could be searched for a potential bomb? "But it's just for everyone's safety"...

    The proportion of iCloud photo uploads being child porn (or "CSAM", one more stupid abbreviation to remember) is probably in the same ballpark as bombs in underwear at supermarket entrances.

    I literally stopped shopping because of this and now use freshdirect.
