Apple Can See All Your Nudes And They Have Their Own Special Folder

August 13th

Well, yes and no. Users are dressing down Apple for training its machine learning algorithms on their intimate photos. Come on: if you're going to recognise, identify and categorise people's nudes and completely creep them out in the process, at least do it fairly.

Some iPhone users are feeling exposed by a feature in their phones' Photos app. Apple pre-selected the term "brassiere" to be available as a search term in Photos, and since the feature first came to light, huge numbers of users who did not know about it have taken to social media to warn their friends.
"Brassiere" is one of those search terms. In an era of changing norms around privacy and the internet, it's times like this when people begin to question whether our devices know us too well, and whether this is really the kind of world we want to live in. The first realisation that Apple had been categorizing pictures that were perhaps better left uncategorized, never to see the light of day again, came on Twitter in October. But it seems that the AI hasn't quite got the "brassiere" theme nailed just yet: many users have found that perfectly innocent photos of themselves in vest tops, or ones revealing just the tiniest bit of cleavage, have been added to the album.

While people online are generally creeped out by Apple's antics, its policy of categorizing images locally on each device is notably different from how Google does things, where analysis happens on the company's servers. Still, the episode has opened up a wider debate about just how far technology has come in recognizing your pictures in such detail, and to what extent that tech could allow companies to tap into every little aspect of our lives.
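For the curious, the mechanism behind those searchable albums can be sketched in a few lines. This is a simplified illustration, not Apple's actual implementation: the `classify` function below is a hypothetical stand-in for the on-device ML model (Apple's real model and full label list are not public), and the file names are invented. The point it demonstrates is that classification and indexing can happen entirely in local memory, with no photo ever leaving the device, which is the key difference from cloud-side approaches.

```python
def classify(photo: str) -> set:
    # Hypothetical stand-in for a local ML model. In reality a neural
    # network would run over the image pixels on the device itself.
    fake_labels = {
        "beach.jpg": {"beach", "sea"},
        "selfie.jpg": {"person", "brassiere"},  # false positives (e.g. vest tops) happen
        "dog.jpg": {"dog"},
    }
    return fake_labels.get(photo, set())

def build_index(photos):
    # Map each label to the photos carrying it, entirely on-device:
    # only the index is stored, and nothing is uploaded anywhere.
    index = {}
    for photo in photos:
        for label in classify(photo):
            index.setdefault(label, []).append(photo)
    return index

index = build_index(["beach.jpg", "selfie.jpg", "dog.jpg"])
print(index.get("brassiere"))  # searching a term returns its album
```

Typing a term into the Photos search box then amounts to a simple lookup in an index like this one, which is why the "brassiere" album appears instantly rather than after a round trip to a server.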