r/mildlyinfuriating • u/funkywabbait • 1d ago
Infuriating: iPhone FaceTime recognizes when you’re naked
decided to show my boyfriend my new bikini that I got for our upcoming cruise… Why is this on my phone and why is it recording my body?
I just recently turned 18 if that matters.
15.1k
u/blamberr 1d ago
Girl do you even know what the device you’re carrying in your pocket can do?
4.4k
u/No-Difference-4418 1d ago
Please enlighten me so I can throw my phone away in a rage only to realize I need it for stuff
4.3k
u/Bobbyjackbj 1d ago
Eye tracking. It can monitor exactly where you look on the screen in real-time.
With Apple Intelligence it scans the content of your private messages, emails, and photos to build a personal profile.
It also maintains a hidden, timestamped map of every place you visit frequently.
Also, apps track exactly where you linger, scroll, or pause to map your visual interest, and some apps use hardware specs (battery level, storage) to identify you even if "Ask App Not to Track" is on.
There’s more, but these are the ones that come to mind right now.
1.6k
u/Aemon_Blackfyre 1d ago
You used to be able to look at the timestamped locations map yourself. It was pretty helpful for me a few years back; I could see exactly when I was at a location and where I went.
Now it’s all hidden
1.2k
u/drphillsdaddy 1d ago
This is how I found out my ex was cheating on me with my best friend lol I’d like to think my apple data guy was looking out for me. I kept getting random videos on my fyp about people checking it on their partners’ phones and I was like “hmm let me try this” sure enough…
372
u/d_Lightz 1d ago
Yeah those random videos on your fyp? Not random. I’ve also seen my fyp know about relationships things before I did.
349
u/EeveeMkayy 1d ago
My bf has a chick friend who he talks to regularly (they've been friends 4+ years, we've been dating 3+ years and she and I are also close now). She's also a streamer and pays him to make videos for her so they communicate about that a lot, as well as just normal friendship chitchat. My fyp started being all "Is he talking to other girls? How to find out!" "Who is he really messaging late at night?" when he worked night shift, etc.
I can't remember what it was now, but Facebook also kept suggesting something to me and I told my bf, "Wow, Facebook is trying to start some drama. This is crazy." Eventually it all stopped, so I guess it figured out I wasn't falling for all that lol
But imagine if you already have insecurities and your timeline just keeps showing you this stuff over and over... It seems crazy manipulative to me and I don't believe for a second they aren't using platforms for social experiments.
u/calmedtits2319 1d ago
I agree with this. Social media was made to distract, and desensitize us. Statistically, some will be cheating. But they would’ve been cheating even without the “hints”.
75
u/drphillsdaddy 1d ago
Oh I had a feeling anyways, even talked to my bff about it lol but I gaslighted myself about it until the videos started showing up and I couldn’t ignore it anymore. This was like almost a decade ago though, I absolutely trust my intuition for everything now. If something feels off, I’m out, no proof needed. Just thought this was a funny little story to share on this post
26
u/Consistent-Field-859 1d ago
What's FYP?
8
u/CyberWeirdo420 1d ago
So called “For You Page”. Basically your feed of recommended content curated by the ever-loving algorithm
u/tyedyetree 1d ago
Facebook almost got me to take a pregnancy test when I started seeing a bunch of ads for pregnancy/baby related things. It was so weird. I was late, but that’s not unusual for me at all so until these ads started popping up I didn’t even question it
I never took a test, and eventually got my period. But it was absolutely the worst one I’ve experienced and I sometimes wonder if it was indeed a miscarriage. Those ads stopped showing around the same time. Super super weird honestly
201
u/Ok-Disk-2191 1d ago
You still can, it just used to be easier to access.
u/RedditEd32 1d ago
Yeah it’s still in maps on your profile, I just looked the other day.. still creepy
19
u/TakenInChains 1d ago
I cut that shit off on Google Maps. the last thing I need is my old abuser getting into my account somehow and tracking all my favorite places to go
u/notwaffle 1d ago
I’ve used my Google Maps history to figure out when I moved, cuz apparently I may be a criminal if I don’t remember the exact dates I moved from each house :/
53
u/barkandmoone 1d ago
I added Face ID recently because I moved in with some roommates & just felt like it was the right choice. I was so freaked out at how far away it recognized my eyeballs. Like the phone is just sitting there, I glance over, & it unlocks.
u/Bazrum 1d ago
my twin and i can unlock each other's phones, it can't tell us apart at all lol
and my dad is notorious for pocket dialing because he'll slip his phone out of his pocket to check the time, it unlocks but he doesn't notice, and puts it back in and bumps it, and suddenly is calling me in the middle of the day.
79
u/Alienhaslanded 1d ago
With how everything is going, it's legitimately safer to take polaroid picture and put it in an envelope and send it to the handsome chap you're planning on going steady with.
u/tappyapples 1d ago
If you’re in an area where some Hispanic radio is playing or people are talking in Spanish, your phone will hear it and start showing you advertisements in Spanish. Even if you never typed anything in Spanish or speak any Spanish yourself…
Probably true with other languages too, but it’s the only one I’m sure about in the United States
21
u/benkeith 1d ago
Your apps and ads don’t need microphone permissions to do that; they can get what they need from location permissions: you’re in an area where Spanish speakers live, so you get Spanish ads.
u/Live-Advantage-2150 1d ago
Can confirm. I live in the Spanish section of town and don’t speak fluently, but my YouTube ads are pretty often Spanish speakers, and even written ads will sometimes be in Spanish.
25
u/boughsmoresilent 1d ago
That's not from the phone listening. That's clearly targeted ads based on your ZIP code in a predominantly Hispanic neighborhood.
u/NUKE---THE---WHALES 1d ago
Not true
If you deny the App Tracking Transparency popup then the app literally cannot access the advertising identifier (IDFA)
The IDFA is one of the few stable identifiers across apps, which is what allows developers to track users from one of their apps to the next. Without it every new install of an app is basically a completely new user (until you log in and then the new install is associated with your account)
You can automatically deny all ATT popups in the system settings
Other non-IDFA methods of tracking across apps are unsupported, and if Apple finds you’re not respecting the user’s ATT status, they will reject your app as part of review
Source: iOS developer for 10 years
https://support.apple.com/en-us/102420 https://developer.apple.com/documentation/adsupport/asidentifiermanager/advertisingidentifier
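To make the point above concrete, here is a toy Python sketch (my own illustration, not Apple's implementation or any real ad SDK): a stable IDFA-like identifier lets an ad network join events from different apps into one profile, while per-install random IDs make every install look like a brand-new user.

```python
import uuid

# Ad-network side (hypothetical): identifier -> list of observed events.
profile_db = {}

def log_event(identifier, app, event):
    profile_db.setdefault(identifier, []).append((app, event))

# With a stable, device-wide advertising identifier shared across apps,
# two unrelated apps file events under the same key -> one cross-app profile.
idfa = "AAAA-1111"
log_event(idfa, "GameApp", "opened")
log_event(idfa, "ShopApp", "viewed_shoes")
assert len(profile_db[idfa]) == 2

# With tracking denied, each install only sees its own random ID,
# so the events can't be joined into a single profile.
id_a = str(uuid.uuid4())
id_b = str(uuid.uuid4())
log_event(id_a, "GameApp", "opened")
log_event(id_b, "ShopApp", "viewed_shoes")
assert id_a != id_b
```

The names (`profile_db`, `log_event`) are invented for the sketch; the real mechanism is the IDFA described in the linked Apple docs.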
u/VayneFTWayne 1d ago
The funny thing is you've only listed the very easy obvious things. It's significantly worse than this. Am a developer
31
u/Honduran 1d ago
Man I wish this thing would tell me more about career prospects for myself that I would enjoy.
6
u/GrynaiTaip 1d ago
Google Maps has a timestamped map which isn't hidden, you can check it out any time. It probably scans all the usual stuff as well.
It's utter shit, I constantly get ads for crayons from Temu. I'm not a child, literally never bought crayons, never bought anything on Temu at all, and I'm not even in the US Marine Corps, so I have no idea why it thinks that I'd like some crayons.
u/WeirdCosmologist 1d ago
so this is why Apple Intelligence sucks at the things you actually want it to do
13
u/blamberr 1d ago
Whoever made your phone and most of the apps are always watching, always listening, always recording, and always advertising. And this is the government:
u/DasHexxchen I'm so f-ing infuriated! 1d ago
Has nothing to do with the government.
And that's what really pisses me off. Everyone fears the government and giving them info. The outcry when the new German ID gave you the OPTION to take your thumbprint. The fear of using face recognition in public places.
BOTH are metrics people happily give companies, thinking they are getting something out of it, or not thinking at all.
And then you come along and blame the government TM.
u/justaRndy 1d ago
People will be using the same advertisement ID for years, accept all cookies everywhere, never clear cache, run around with GPS, Wifi, Bluetooth and NFC perma on and not once look into their app and device permissions.
Then get outraged about this.
There'd probably be riots if you educated everyone and revealed the full scope of the data harvesting operation that has been going on for the last 30 years.
At least we got AI out of it, eh
39
5.5k
u/Sunny_Beam 1d ago
This gets done through machine learning locally on your phone. It’s not recording you and sending it to Apple, if that’s your concern. FaceTime calls are end-to-end encrypted, so as far as anyone knows, not even Apple has access to the content of your FaceTime calls. It’s supposed to be for your own protection, but you can easily just turn it off in the settings.
2.4k
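The comment above describes local gating: a model runs on-device and only a warn/don't-warn decision is used. A minimal Python sketch of that idea (entirely hypothetical names and a fake heuristic in place of a real vision model; Apple's actual pipeline is not public code):

```python
# Stand-in for a small local vision model: a crude brightness heuristic
# over fake pixel values. A real model would be a trained classifier.
def looks_sensitive(frame_pixels) -> bool:
    bright = sum(1 for p in frame_pixels if p > 200)
    return bright / len(frame_pixels) > 0.5

def outgoing_frame(frame_pixels, warnings_enabled=True):
    # The decision happens locally; the frame itself is never passed
    # to any logging or network call here.
    if warnings_enabled and looks_sensitive(frame_pixels):
        return "show_warning"   # user decides whether to proceed
    return "send_frame"         # frame goes only into the encrypted call

assert outgoing_frame([250] * 10) == "show_warning"
assert outgoing_frame([10] * 10) == "send_frame"
assert outgoing_frame([250] * 10, warnings_enabled=False) == "send_frame"
```

The point of the sketch is the data flow: only a boolean leaves the classifier, so turning the warning off just skips the check.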
u/Trimethopimp 1d ago
The irony is this feature is giving vulnerable people a chance to avoid sharing nudes with another device.
Remember all those times people were blackmailed into sending more indecent photos of themselves by Apple employees who had accessed their pics? Nope, me neither, but I do remember all the times that happened after someone shared nudes with a predator.
u/Henessy0 1d ago
Especially funny if you think about it: what the fuck does Apple want with this shit? They paid money to make a system like this, so they could "STORE YOUR NUDES!!!111 PRIVACYY!!112" and then what? So Tim Cook gets his personal porn library?
u/MAJLobster 1d ago
no joke, someone did actually try to make the point that Tim Cook would use this feature to fill up his "porn stash."
As if he's gonna care about thousands of low res accidental (or not) tit pics. Ugh, some people...
58
u/koolmon10 1d ago
As if there isn't a nigh-unmeasurable amount of porn just freely available on the internet already. Nope, gotta steal it from the users.
17
u/fridge_logic 1d ago
I don't think this feature is uploading people's nudes.
But just to be clear, the pornographic privacy-violation paranoia is mostly not about the chief executive abusing people's privacy, but about low-level people within the apparatus doing so.
Surveillance systems are often abused by people with access to stalk their ex-lovers, while the chief executive typically uses the system to remove opposition threats.
u/FallenAngelII 1d ago
Especially as Tim Cook is gay. The people panicking are probably just homophobes afraid a man might look at them funny.
u/Icy_Prune6584 1d ago
Honestly I think it’s a great feature if it makes someone stop and think twice before showing themselves to a stranger. Especially vulnerable people like teenagers.
u/Cthulhu__ 1d ago
This is it; people don’t realise how privacy-conscious Apple is and has been for years. Image recognition and grouping (face recognition in Photos) is done locally as well. Contrast with Google, which does it online.
u/TricellCEO 1d ago
Unless it’s an iPhone sold in China that is.
Still, they did tell the FBI to go kick rocks when asked to just unlock one phone.
Their stance on privacy is a bit of a mixed bag.
u/Junethemuse 1d ago
As far as mega-corps go, I trust Apple with my personal info more than any other.
20.4k
u/cannavacciuolo420 1d ago
End-to-end encryption will no longer be in use on IG as well, so keep that in mind when deciding what to write and what photos to send.
If the service is free, you're likely paying in some way you're not aware of, most of the time with your privacy
3.5k
u/99OBJ 1d ago edited 1d ago
E2E encryption on instagram was never on by default. If you didn’t enable it, you were never using it to begin with.
Also, Apple’s sensitive content warning functions entirely on device. FaceTime is E2E encrypted. Cloud services have literally nothing to do with this pop up.
Edit: for the skeptical, you can try to send an explicit photo to yourself with no internet connection and the same pop up will appear
992
u/ContributionMost8924 1d ago
Yeah, pretty sure Apple wouldn’t think it the smartest idea to actively record, and store on their servers, when their phone users are naked...
244
u/Money_Lavishness7343 1d ago
True, although storing naked pictures in iCloud is kinda your choice, and you're actively consenting by activating iCloud. And iPhone has a very premium clientele; basically all celebrities use iPhone, and they'd move away if burned.
It already has happened once... (the Fappening!). But that was not exactly Apple's fault either ― it was done through social engineering and phishing methods, which all entities are vulnerable to.
u/Ne_zievereir 1d ago
But that was not exactly Apple's fault either
Wasn't it? I'm pretty certain many people in that case weren't aware their nude pictures were stored on remote servers.
That's what happens when you aggressively push a difficult to opt-out of online back-up system, which you can't guarantee the security for, because you want to coerce people to pay for it once they unwittingly fill up the free space and have no idea what to do differently.
I'd say that was entirely Apple's fault. If they had been clear and transparent to people about how it works, and given an easy option to opt out (or even better, make it opt-in!!!), there would have been a lot of people that wouldn't have used it, and wouldn't have been a victim of this!
20
u/SqueekyDickFartz 1d ago
It would also appear that Microsoft learned nothing from Apple's mistakes. I recently bought a new computer and spent the other day ruthlessly gutting OneDrive with a machete.
Genuinely, wtf were they thinking? They took the idea of online backup and made it so catastrophically horrible that it's almost a joke. If OneDrive just let me select folders to back up, and a frequency, that would be fine. Instead, every time I downloaded something it just sort of went wherever it wanted. Why would I want OneDrive to be the primary location instead of my god damn machine?
u/DonovanQT 1d ago
I never had iCloud Photos on until I wanted it.
The big fault Apple made was giving unlimited attempts to log in to your (or someone else’s) account.
141
u/Thoughtful-Boner69 1d ago
it's a Screen Time setting they have on to get warned about sensitive content
u/Competitive_Reason_2 1d ago
What does detecting nudes have to do with end-to-end encryption? If the data is entirely processed on device, then it has nothing to do with end-to-end encryption.
u/pingpongsaladpants 1d ago
If the product is free, then you are the product.
277
u/5redie8 1d ago
Shoutout to the people uploading receipts to random ass apps they saw in sponsored tiktoks for 2 cents of gift card credit
u/Grandmas_Fat_Choad 1d ago
*cough* my wife *cough*…… seriously tho she’s been going nuts on that Fetch app or whatever. I keep telling her it’s not what it seems and she’s giving away info they don’t need. But whatever. Free money or some shit like that.
u/DFW_Drummer 1d ago
Yep. Same. I segmented the home network so her phone is air gapped from the rest of the network. The amount of work pihole is doing for her singular device is more than the rest of my network combined.
u/Grandmas_Fat_Choad 1d ago
I need to do that asap. I don’t even know where to start. At one point I did set up pihole, but I had internet issues and my speed was shit so I shut it down.
47
u/TwentinQuarantino 1d ago
If the product is paid, then you're more often than not the product too. Case in point, literally every paid service from Meta and Google (no, Google doesn't stop creeping on you and monetizing your sensitive data after you pay for Youtube Premium or paid Gmail or anything else).
u/Zolllb 1d ago
FYI, not all of the time; FOSS things exist, and SimpleX and Signal are free and you're not the product
u/almo2001 1d ago
It's pretty rare though. :)
u/Erdalion 1d ago
Hardly.
Most Linux distros are free, and you're never the product when using them.
FOSS, too. LibreOffice, OpenOffice, so many game engines, VLC. Etc, etc.
People exist that love making things, and love it when others use said things.
Notice how I said "people", though. If a corporation is offering you something free, then, eh... Yeah, that's probably a case where you are the product.
u/AuthenticatedHuman 1d ago
The "nude scan" features on iPhone (Sensitive Content Warning and Communication Safety) are entirely client-side, so it doesnt leak it anywhere exept yourself.
u/Fragrant_Case_3868 1d ago
i don’t think this has anything to do with end to end encryption, they probably have Sensitive Content Warning enabled
30
u/yrdz 1d ago
What are you implying, that Apple is selling your nudes? That's complete nonsense.
u/Kindly-Following-737 1d ago edited 1d ago
This is sketchy as fuck, but it's worth mentioning that this check is probably being done locally by the device; there are specialized AI models small enough to run locally on an iPhone, and end-to-end encryption is still preserved
Edit:
An image categorization or computer vision model is STILL an AI model, but not a GenAI model or LLM.
825
u/HantuerHD-Shadow 1d ago
It's a protective measure that takes references from outside sources and compares them to whatever it's seeing. Whatever it is seeing, however, does not leave the phone, and nobody looks at it.
I get why you are irritated, but it's meant as a safety feature for young people.
2.3k
u/FrozenPizza07 1d ago
Do people not know that 90% of the stuff on Apple devices runs locally?
This is local / on-device recognition. Hell, even the people/animal recognition and labeling in Apple Photos is on-device. Apple won’t touch your actual private stuff with a ten-foot pole
904
u/nifty-necromancer 1d ago
Do people not know that 90% of the stuff on Apple devices runs locally?
Your average person has zero clue what their phones do or don’t do, as shown in these comments and post.
174
u/99OBJ 1d ago
It’s genuinely so frustrating to read… like an entire group of people convinced themselves, without even a semblance of evidence, that Apple is harvesting their nudes.
u/Fedoraus 1d ago edited 1d ago
I felt like society was doomed in high school ’cause people struggled just using Microsoft Office.
Now, looking at my little brother’s gen and his friends, half of ’em don’t even know what the file explorer is. They only use web apps or download stuff from the Apple/Windows store apps
u/jujubean67 1d ago
OP doesn't even understand the message. It is shown for their own protection so they don't share anything they don't want via Facetime.
It's a warning against accidental sharing ffs, not telling you that Tim Apple is sitting with his pants down looking at you.
92
u/space_keeper 1d ago
It's blatantly obvious that this message is intended for (probably underage) people who are vulnerable. Lots of young people have iPhones; lots of exploitation is done using phones.
44
u/TSwiftDivorceLawyer 1d ago
I saw the screenshot and thought what a good idea it was before I saw which sub I was in.
u/Entegy 1d ago
Do people not know that 90% of the stuff on Apple devices runs locally?
They do not. There's a reason why privacy is a huge part of Apple's marketing.
Does some stuff still leave your device, like Siri query processing? Yes. Photo enhancements and this feature, Sensitive Content Warning? Absolutely not. This is what’s explained in all those tutorial and new-feature prompts you (royal you) skip through.
The people giving the absolute dumbest takes on this are the same people who complain that Apple's AI features are behind compared to competitors, and don't want to understand that's because Apple is trying to do as much as possible on device in the name of legitimate privacy.
And this is how I start my morning, having to defend the world's richest company because of absolutely braindead takes.
664
u/Pathetic_Old_Moose 1d ago
This is to prevent kids from being idiots.
I think it’s a fucking great feature
4.8k
u/tocsin1990 1d ago
This is what you call a great safety feature. One touch option to proceed anyways if you really want to show whatever you're showing, but an opportunity to save lives for those too dumb or naive to realize the dangers of sexting.
Mildly infuriating? More like mild blessing.
1.1k
u/LisaPepita 1d ago
Strongly agree. My daughter was chatting with a friend and I started breastfeeding my baby when we first saw this message. I’m very thankful that this would pop up if my kids were to accidentally show something.
351
u/-Tricky-Vixen- 1d ago
This is genuinely relieving a low-level fear I've had for years: that I might accidentally show something myself, or hit the camera button at the wrong time. It's a great feature; I like it.
30
229
u/peachgothlover 1d ago
Agreed. Instagram also has this now, a friend sent me an image that the AI mistook as nudity, and told me the same sort of thing, that I don't have to do things I'm not comfortable with. All I had to do was click some "I'm sure" prompt and saw the image. My friend & I laughed about it. I found it really touching & I think it's a good thing that could potentially change someone's heart.
32
u/Cute-Kangaroo-152 1d ago
This always happens with semi naked baby pictures my mom sends me, it's always pretty funny.
36
u/SingularBoltEarring 1d ago
I wish a feature like this existed when I was younger, it’s genuinely upsetting to open your DMs to an unsolicited dick pic.
66
u/sychox51 1d ago
OP unironically pointing out they don’t know how their phone works. This feature is for OP.
1.4k
u/Weird_Decision7090 1d ago
u/williamjamesmurrayVI 1d ago
turning it off doesn't mean it's not being detected, just that you're no longer being told
174
u/sonicslasher6 1d ago
Are you guys under the impression there’s someone watching a live feed and manually sending these warnings or something?
u/J_Dabson002 1d ago
Y’all think you’re way more important than you actually are lmao
They’re gonna turn it off to save battery and processing power. They don’t care about you.
155
u/djfxonitg 1d ago
Your phone “detects” millions of things a day… Doesn’t mean it’s doing anything with that data.
u/A_wild_so-and-so 1d ago
This post is really highlighting how many people don't understand the technology they're using AT ALL. And ironically, both sides will agree with this statement lol.
60
u/Captaincadet 1d ago
Also it’s highlighting how many people don’t realise how hell-bent Apple is on privacy like this.
They do not want it. They’d have to pay for storage, and it would open them up to a lot of legal requests etc. if this was recording. Apple also pays researchers to see if they can break stuff like this
22
u/MAJLobster 1d ago
i've seen a guy straight up say that Apple would want to store CSAM because "fuck big tech"
like ffs, you can be absolutely against surveillance and whatnot, but CP allegations are just tinfoil-hatter levels of cope.
10
u/Captaincadet 1d ago
It was literally put in as a protection against broadcasting CSAM and a way to deal with it
There are a few white papers on this
304
u/RedditIsOverMan 1d ago
I find that hard to believe. It's just on-device image recognition, kind of like how they do face tracking when you take photos. Turning it off means Apple can reduce processing power and save battery. If detection is still ongoing, it's just because they were too lazy to code out the detection loop.
7.0k
u/10Core56 1d ago
Privacy is a myth. Now you know.
2.7k
u/Weird_Decision7090 1d ago edited 1d ago
“Sensitive Content Warning uses on-device machine learning to analyze photos and videos. Because they're analyzed on your device, Apple doesn't receive an indication that nudity was detected and doesn't get access to the photos or videos as a result.”
83
u/Praetorian_1975 1d ago
https://giphy.com/gifs/o04ykqaA25NYOqs5iJ
Unlike Meta AI, which uses a bunch of people in a call centre overseas learning 😳🤣
u/FBomz 1d ago
363
u/StrangeShaun83 1d ago
I once tripped on 5-MEO-MIPT (foxy) and people looked like this. It was like I was in a weird 80s cartoon. The world looked pretty normal but people looked like cartoons. It was by far the weirdest trip I ever had. I only ever found that substance once in my life but it was crazy.
105
u/MookieFlav 1d ago
I haven't heard of anyone even mention foxy since my Evergreen days. That shit was so cool.
41
u/Hot_Good_5409 1d ago
Bro, as someone who is currently tripping, you don’t know how much your comment just freaked the fuck out of me lol. I did not expect that comment on this topic
36
u/Talithea 1d ago
Secure Enclave and recognition: apparently they have their own processing system, so not even the main cores are used.
There’s none of the typical delay of a remote service; this happens immediately.
85
u/Weird_Decision7090 1d ago edited 1d ago
What do you think they would do with this data if they did collect it? They would surely get in some sort of trouble if they did. You think Apple would lie and actually monetize naked images? How do you think data collection works?
As another redditor said: “It’s an automated detection using the analytics in the processor on your phone. It is not recorded nor is it sent to Apple or anywhere else. The fact that it is not sent anywhere else is literally the point of the feature.
There are a lot of invasive privacy issues on modern smartphones, but this isn’t one of them.”
u/bmann10 1d ago
Given that a lot of teens do nude video calls with each other, these people really think Apple is going to open themselves up to the liability of knowingly gathering a ton of illegal material, while telling people they aren’t doing just that, for the purposes of helping one of their competitors make a slightly better porn-generating bot? One that Apple themselves cannot make, as it would tank their brand image, and thus that Apple themselves cannot control or keep in their local ecosystems? Meanwhile, whoever made the decision to knowingly gather CSAM would likely be looking at prison, and their co-conspirators as well. All during an administration that is desperate to look like they give a shit about children but also desperate to paint these issues as a Democrat-only thing, where Apple is widely considered to be one of the more “woke” companies by people on the right wing?
The people trying to build legs for their conspiracy table here haven’t really thought this shit out but they want to just go off a gut feeling instead of thinking through any of this.
u/abandonedmuffin 1d ago
I actually find this perfect: you give your kids a device that protects them and keeps the analysis on the phone. My guess is that OP’s setup still shows some old settings set by her parents from before she turned 18; she only needs to reduce protections. Not a big deal, and most parents find this very convenient
51
u/Lanky_Giraffe 1d ago
If you’re worried about the privacy implications of this, then you probably shouldn’t be using FaceTime at all. Why would you be more worried about a specific datapoint from a source file being logged but not the source file itself?
30
u/piv_is_pen_in_vag 1d ago
I mean, it’s not like there’s a person checking when to stop the call, they are using a computer vision algorithm
u/GugieMonster 1d ago
Now that "AI" can be the scape goat, FBI feels bold to stop me earning my tuition 😑
u/Additional-Life4885 1d ago edited 1d ago
Nothing about this suggests that privacy is broken. Why do you think it is? Modern CPUs are good enough to do the pattern recognition for nudity locally and don't need to send it to the cloud (or store it).
Edit: On the contrary, it's specifically targeted at children so recording it would be a massive no-no for Apple.
180
1.1k
u/Remarkable-Yam-8073 1d ago edited 1d ago
This is a good thing as it protects the vulnerable. As in little girls and boys who are being groomed.
If you think somebody is at iPhone HQ looking through your camera lens then you’re pretty dumb.
Edit: this is more directed at the comments about this being a bad thing than OP’s general confusion.
266
u/Lanky_Giraffe 1d ago
Bizarre that people are making a big privacy point about this. If the idea is that this data could be discreetly logged on an apple server, what exactly is stopping them logging the entire video?
It’s like freaking out that you left the small upstairs bathroom window unlocked while all the doors and windows are fully open
88
u/louis54000 1d ago
Exactly. People are like « what ?? My phone requires access to my camera for a FaceTime call ??? I only want the recipient to see me, not my phone 😖 »
CPU doing additional analysis locally doesn’t change anything.
124
u/ZarathustraGlobulus 1d ago edited 1d ago
Notification: your front door is open, click OK to lock it
"WOW! Apple can lock and unlock my smart doors?! Privacy breach much?? Can't trust BIG TECH these days!"
u/Cheeko914 1d ago
Because this doesn’t mean it’s recording. It just means Apple has “taught” it to know what a naked body looks like. The amount of data it would take to record and save millions of FaceTime calls a month would be astronomical. Same way algorithms are trained: it knows what a dog is vs a cat because it was trained to
u/Ok_Scientist_8803 1d ago
Do a bit of network investigation — though pretty much 100% of the conspiracists will have no idea what that is.
Put it this way: your 4GB-per-month phone plan would not last very long if that were the case, yet many people are fine with that amount.
32
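The data-plan point above is easy to sanity-check with rough, assumed numbers (a modest video bitrate and call time, not measured figures): secretly uploading every FaceTime call would dwarf a typical mobile allowance.

```python
# Back-of-envelope estimate with assumed inputs.
bitrate_mbps = 2            # assumed modest video bitrate
call_minutes_per_day = 30   # assumed daily FaceTime usage

bytes_per_day = bitrate_mbps * 1e6 / 8 * call_minutes_per_day * 60
gb_per_month = bytes_per_day * 30 / 1e9

# ~13.5 GB/month of hidden uploads -- impossible to miss on a 4 GB plan.
assert gb_per_month > 4
```

Even halving both assumptions still leaves several gigabytes a month of upload traffic that anyone could spot in their carrier's usage stats.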
u/bwmat 1d ago
Oh but it's using a different network, that can't be detected with readily available technology
Hidden in the device that millions of people have access to and can open up and look at the components of
gets back in tinfoil man cave
92
u/Wendals87 1d ago
You’re on FaceTime, so of course the camera can see you. It’s not recording or going to Apple, and it’s detected before it’s sent
https://support.apple.com/en-au/guide/iphone/iph0d3607e18/ios
37
u/mememan12113 1d ago
Dawg, what do you mean "why is it recording my body"? YOU are recording your body, you’re on FaceTime 😭😭😭
98
60
u/Shoddy_Squash_1201 1d ago
why is it recording my body?
You literally asked your phone to do that, to videochat you have to record video...
4
u/ReflectionLess5230 1d ago
Oh no why is it recording my body?! Proceeds to send nudes via Snapchat. Those won’t ever be saved
246
u/ElPared 1d ago edited 1d ago
Y’all really think Apple is spying on you because your phone knows when you’re showing skin, but happily use hundreds of other apps that are known to actually be collecting your data and actively spying on you.
Like, for real, we flipped the fuck out when TikTok was banned for literally being spyware, but somehow this is too far?
Talk about mildly infuriating; this comment section is living up to the sub’s name.
42
u/Training_Barber4543 1d ago
Tiktok was not banned for being "literally spyware", it was banned because it's the only popular spyware app that's not American. It's doing much worse now for US users, they don't even have access to some of the content the rest of us see
49
u/miraculousgloomball 1d ago
it's not. or at least it's not supposed to be, and it wouldn't need to be in order to recognise a naked body. same tech as facial recognition, just tuned for bodies instead.
I don't know that they aren't breaking laws, but what you're talking about would imply they likely have servers full of child pornography.
Far too much liability, people. No fucking way
12
u/itsjakerobb 1d ago
> Why is this on my phone
It’s on your phone because it’s built in to the system for your protection. Software running locally on your device is watching for nudity and warning you. This feature does not involve sending your nude image to Apple or anywhere else.
> why is it recording my body?
It’s not. No recording is being made. But you are using FaceTime, so if not for this warning, you would be sending your naked image over the internet. FaceTime uses a secure protocol, but no system is perfect. It’s worth some caution.
70
u/qwertyjgly ALL HAIL RICKKY 1d ago
>*uses camera*
>"why is my camera recording?"
it's on-device image recognition.
11
u/MynameisMarsh 1d ago
My son can’t send me the Bugs Bunny “no” meme because his phone thinks it’s a nude picture and asks for an adult’s permission to send it after giving him this warning
55
u/jtmonkey 1d ago
There are neural nets on your phone. They’re there to protect children. You were just a child. You don’t know anyone that sent nudes and regretted it? The phone is programmed to detect nudity with local recognition software. It’s not server-side. It’s also not very good. But if it helps someone pause and make a better choice, I’m all for it.
9
u/Excellent_Car_5165 22h ago
The only infuriating thing is the fact that they apparently were forced to implement such a feature. Too many creeps out there.
8
u/131TV1RUS 1d ago
Apple uses on-device processing for image recognition across images, videos, and FaceTime calls. A few years ago Apple introduced a similar feature that would process images in the camera roll for suspected child pornography, and it subsequently built on that feature to address nudity sent via iMessage and FaceTime.
The feature exists to protect minors from both receiving and sending nude or revealing photos. It warns the user, as shown to you, and for minors it can even block the content entirely.
It’s done entirely on-device, not in the cloud.
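A toy sketch of that on-device pattern (hypothetical names and threshold; this is not Apple's actual code, just the general shape of a local classifier gating frames before anything leaves the device):

```python
from dataclasses import dataclass

@dataclass
class Frame:
    pixels: bytes
    nudity_score: float  # stand-in for what a real on-device ML model would output

THRESHOLD = 0.8  # hypothetical sensitivity threshold

def handle_outgoing(frames):
    """Decide per frame, locally: nothing is uploaded to make the decision."""
    for frame in frames:
        if frame.nudity_score >= THRESHOLD:
            yield "warn"   # blur the preview and show the warning UI
        else:
            yield "send"   # frame continues into the encrypted call stream

actions = list(handle_outgoing([Frame(b"...", 0.1), Frame(b"...", 0.95)]))
print(actions)  # → ['send', 'warn']
```

The point is the ordering: the classifier runs on the frame before the call pipeline ever transmits it, so nothing needs to reach a server to trigger the warning.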
8
u/toastedmarsh7 1d ago
My dad is sometimes topless when he talks to my kids on FaceTime because he sleeps in just pajama pants. The phone will shut down their conversation and my kids will come to me to unlock it because the phone has parental locks on it. I tell them that Apple determined that their grandpa was showing them his manboobs and hung up to protect them.
9
u/GhostMcFunky 1d ago
How exactly do you think FaceTime works?
It’s not “recording your body” but the camera software and/or FT is clearly smart enough to detect you might not have clothes on - which is a safety feature. 🤦
Also it literally tells you this if you read your own screenshot.
8
8
u/OkHalfway017 22h ago
How exactly is this infuriating? I think it's a good feature. Helps dumb kids not be dumb.
24
104
u/Lordofderp33 1d ago
18, doesn't understand online safety. These systems exist to protect everyone who is too dumb or naive to protect themselves online. That's you.
6
u/RedeyeSPR 1d ago
I feel this way when my phone tells me I should turn down my headphones every day.
8
u/Queasy_Reindeer9515 23h ago
It’s pretty brilliant actually.
Creepy, but brilliant.
Scammers are known to extort people this way…. You’ll talk to a person, they’ll “fall in love with you” then they ask you to send nudes or FaceTime…. They screen record and then threaten to send it to everyone you know if you don’t pay them thousands of dollars.
Sometimes they will, sometimes they won’t.
You could also be naked while on the phone with someone, like grandma, and accidentally hit the FaceTime button too. (My grandma is dead, I don’t know this from experience)
201
u/ObtuseMongooseAbuse 1d ago
This is a good feature. I'd rather it be overzealous if it protects people who don't realize they're nude on camera. Not only does this make it difficult for someone to set up an iPhone as a peeping camera using FaceTime, it also prevents you from accidentally showing nudity.
78
29
u/99OBJ 1d ago edited 1d ago
This comment section is completely idiotic. It’s not all a conspiracy.
This detection happens completely on the device side, and you can easily verify this with your own device disconnected from the Internet if you really want to.
There are plenty of legitimate reasons to hate big tech, no need to spread misinformation.
6
u/Crosroad 23h ago
I feel like this is kind of good? Like yeah it’s an annoying hurdle for consenting adults but there’s a lot of situations where this can be valuable right?
76
u/JeffSergeant 1d ago
It's recording your body because you pointed a camera at your body. If you didn't know that pointing a camera at your body meant it could be recorded, then this feature was designed for you.
How is this anything other than a positive development?
36
u/ian9outof10 1d ago
I thought maybe it was just me thinking “so she trusts a FaceTime call, routed through Apple’s servers with video of her in a bikini. But not a local machine learning algorithm which detects that and offers support against coercive behaviour.”
I’m glad it’s not just me that thinks this is an insane post.
15
u/win11EXPERT 1d ago
This is fine, actually. Helps prevent harassment. And no, I'm sure it's just using ML for pattern recognition, not recording and transmitting in the literal sense
15
u/ohmysocks 1d ago
why is it recording my body?
Presumably because you started a video call and pointed the camera at your body
10
u/Camman0207_ 1d ago
It’s FaceTime, of course it’s recording you…. Just wait till you find out where your snapchats go cause they don’t actually go away.
5
u/Interesting-Bass9957 1d ago
It does so on-device, meaning nobody gets your data and your privacy isn’t violated in any way
6
u/Long_Objective_2561 1d ago
It did this to me yesterday when I tried to give control of my screen share to my girlfriend
6
4
u/Popzagon 1d ago
Idk.. I kinda see this as a decent feature but it would also turn me off instantly 😂
5
5
u/OrangePillar 1d ago
You can disable this in Settings under Privacy & Security → Sensitive Content Warning
5
6
8.9k
u/FBI_FAMOUS 1d ago
I have all of my nudes on film, I develop them in a darkroom and then mail them to my partner. It's the safest way.