iPhone 13 – how Apple plans to SCAN all your PHOTOS and report you to the police


Uh, okay, so Apple just announced amazing technology with great intentions that might have just opened the wrong doors. [Music] All right, what's going on, guys? Welcome back to Front Page Tech, of course the show that gives you all the latest tech news, from one geek (that is me) to another (that is you). Listen, kind of a serious episode today, so no funny intro or jokes throughout the show. We don't usually cover really serious topics like this, but I think it's important, so let's just do that. Let's do the show.

All right, first story of the day, story numero uno. We've got some actual tech news today before things get too serious. How about a Pixel exclusive for you? Another one, that is. It's only been a few days since Google themselves came out and confirmed all of our Pixel 6 leaks, so why not give you a new one, this time for the Pixel 5a. Now, it's important to note that this phone will only be available in the US and Japan; because of the chip shortage that's going on right now, it was effectively cancelled in every other market. But according to sources, I have been told the Pixel 5a will launch later this month, on August 26th (I think that's a Thursday), for $450. It is only going to be available for purchase online or in physical Google stores. The specs that leaked previously, in another report I've seen online, were a bit off, so here is the finalized info: the only color for this device is going to be Mostly Black. The display is 6.4 inches, not 6.2, at 60 Hz, or you can force 90 Hz; so no variable refresh rate for this display, you just get to pick 60 or 90.
The processor is going to be the Snapdragon 765G, the battery is clocked at 4,650 mAh, there is no wireless charging on this device, and RAM is 6 GB. Now, we've been told the device will have the same camera as the Pixel 5, so that same camera array from last year; it'll be IP67 rated and have a headphone jack. So there you go. I know a lot of you have been waiting on information about the Pixel 5a, so there you have it. I expect to see it later this month. Okay, let's do a sponsor and then get a little serious.

[Music] Hey, it's summer, which means you're probably just leaving your balls hanging out, which is what you're doing if you're not using a VPN. Don't let everyone see your balls. Right now VeePN has their summer sale going on: get 74% off and three months of free service with the purchase of a two-year plan, for only $2.59 a month. Not only is VeePN my favorite VPN of all of them, they also now unblock over 350-plus streaming services; that's more than any other VPN. Protect yourself online, don't let your ISP trace you, and unblock all of your favorite streaming services. Make sure to click that link below to get started, and of course a huge thanks to VeePN for supporting the show.

All right, so last up for the day, and just a heads up: this is gonna be a touchy subject, so hang in there, hear me out. I know there's going to be a lot of discussion down in the comments below, so please just respect each other, be kind, okay? That's all I ask.

Right now... yesterday, Apple announced new protections that they're going to roll out by the end of this year within iOS 15 that are made to protect children. One of them is a safety measure within the Messages app that will censor, um, explicit photos. It's YouTube, so there's some stuff that I can't say, so just use context clues, I guess. Effectively, if enabled on a child's phone, if they receive an explicit photo of some kind, it'll be blurred within the Messages app, warning the user before they open it. And then, depending on the age of that child, if the
photo was opened, their parent or guardian will be notified. This goes for both receiving and sending. That is good, that is a good thing; I think we can all agree that's a good thing. But then the waters get kind of murky.

See, this brings us to the second part of these new protections. Apple will be effectively scanning your iCloud photos for child sexual abuse material, which we're just going to call CSAM from here on out. This can all get pretty technical pretty fast, so let me just explain the basics to you. When this rolls out later this year, Apple will scan and detect CSAM images that are stored in iCloud Photos, report them to the National Center for Missing and Exploited Children, and contact law enforcement authorities. The way this is going to work is that your iPhone will be able to scan images within your iCloud photos against known photos in the National Center for Missing and Exploited Children's database, which is not publicly accessible; like, if you wanted to see what's in that database, you can't find it. If there's a match, or matches, and the number of materials exceeds Apple's predetermined threshold, it will go into manual review, where an Apple employee will determine if the images and the user in question need to be reported.

Now, to ensure user privacy, Apple says that all of this scanning will happen on your device. You have to have iCloud Photos enabled for this to happen, so I guess that's a loophole, which is to, well, not enable iCloud Photos. But if you have that enabled, then before those images are uploaded to the cloud, your phone, on the device, will scan those images against those other images in that national database to find a match. "Apple users, also called clients, store photos in iCloud. Apple would like to detect if any of these photos belongs to the National Center for Missing and Exploited Children's database of CSAM photos. If the number of these matches exceeds some predetermined threshold, indicating systematic presence of CSAM, Apple
will report the user to appropriate authorities." That is a direct quote from them, from Apple. So the scans happen on device, and the results are only viewed by Apple themselves if matches are made and the threshold is met, though Apple is not disclosing what that internal threshold is.

On the surface, right, this is good. This is really good. I mean, I think we can all agree that child abuse in any form is bad and protecting children is good; we all agree on that. But I don't know, man, my gut... I feel weird about this. Hear me out. The real privacy concern is this. Listen, I do not want to sound like Alex Jones here, but I am going to be honest when I say that this technology worries me a little bit. Of course we all agree that children should be protected, and that Apple's announcements yesterday are for the greater good. But is that the point? Is the goal to start with children because protecting children is good and cannot be morally argued? I think this deserves its own conversation. Like, is this a gateway to something else, to something more? How long before Apple just announces, "Hey, oh, by the way, we're just gonna start scanning your iCloud photos for drug use, or pirated movies, pirated music"? I mean, yes, in its current form, photos have to be matched with the National Center for Missing and Exploited Children's database; the photos in there are not available to the public, and it is a very specific database with a very specific purpose. But who is to say that this technology, this way of scanning, could not be implemented or connected to a different database, say, like a national drug database of sorts, or anything else, any other form of crimes that you may have stored on your device? Who's to say that that will never happen? Can you comfortably say that that would never happen? Do we accept Apple being this sort of digital hall monitor, or just straight-up vigilante? Is that their responsibility? And are we sure that they'll stop at only this form of crime? Who's to say that they won't expand this as they
see fit? I'm just trying to create a conversation here, because once these doors are open, it's going to be very hard to close them again, and I'm worried that allowing Apple to do this opens those doors. I mean, what are we supposed to do, not allow Apple to do this? Obviously we can't say anything or argue against protecting children; that would be morally wrong. And maybe that's the point, because if we argue against it, well, then it looks like we have something to hide or we condone child abuse. It's tough. It's a catch-22. I mean, I'm already seeing people on Twitter shaming or guilting other users who argue against this technology: like, "What, are you not against pedophiles? How could you support child abuse?" This way of using guilt and shame to silence opinions that are opposite of yours is lame, and in this case naive at best. Of course I don't support any type of abuse against children; I think I made that very clear. That is not what I'm arguing. I'm arguing that the acceptance of this technology may lead to deeper privacy concerns, deeper privacy issues, and open up the doors to further scans of things less nefarious, especially in the wrong hands. And don't even get me started on what would happen if a government had a backdoor to this. And I get it, Apple is not the first company to implement something like this; Google does it, Facebook does it. But Google and Facebook do not promise the same amount, the same level, of privacy that Apple does, and I think that hurts Apple. Again, this is tough, because of course none of us condone child abuse. We don't want that. This is for the greater good. But I do feel weird about it. And that's it. Let's talk about it down in the comments, and I'll see you guys in the next episode.
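[Editor's note: the match-and-threshold scheme described in the episode can be sketched in a few lines of Python. This is a deliberately loose illustration, not Apple's implementation — Apple's actual system uses a perceptual "NeuralHash" plus cryptographic private set intersection, and the hash function, the stand-in database, and the threshold value below are all hypothetical placeholders.]

```python
import hashlib

def image_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash of an image's bytes.
    (Apple's real system uses NeuralHash, not SHA-256.)"""
    return hashlib.sha256(data).hexdigest()

# Stand-in for the non-public NCMEC database of known-image hashes.
KNOWN_CSAM_HASHES = {image_hash(b"example-known-image")}

# Apple has not disclosed the real threshold; 30 is a placeholder.
MATCH_THRESHOLD = 30

def count_matches(photos: list[bytes]) -> int:
    """Count how many of a user's photos match the known-hash set."""
    return sum(1 for p in photos if image_hash(p) in KNOWN_CSAM_HASHES)

def should_flag_for_review(photos: list[bytes]) -> bool:
    # Only accounts whose match count crosses the threshold would ever
    # reach the manual human-review step described in the episode.
    return count_matches(photos) >= MATCH_THRESHOLD
```

The point the threshold makes: a single stray match never flags an account on its own; only a systematic number of matches above the predetermined threshold triggers manual review, which is the "predetermined threshold" language in Apple's quote above.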