iOS 15 Child Safety Explained: New and Top Features to Know

Posted: September 27, 2021 at 6:04 pm

Since its inception, Apple has prioritized privacy and security. It is not satisfied with making best-in-class, best-selling smartphones and tablets; it also wants its customers to feel comfortable using their devices. With iOS 15, Apple is adding another layer of security to its devices in the form of Child Safety.

Today, we will tell you pretty much everything you need to know about Child Safety and help you understand how Apple is bringing this filtering system on board. Now, without further ado, let us check out what Child Safety is and how it can impact your day-to-day life.

Child Safety is an iOS 15 feature set meant to curb the distribution of Child Sexual Abuse Material (CSAM) and help protect children from predators who use communication tools to exploit them. Apple is set to roll out the feature first in the United States, with other regions to follow. It is a three-fold feature that has the potential to put parents' minds at ease. As always, this new addition comes free of cost; the only prerequisite is iOS 15.

Apple is working closely with top child safety experts to make the system as fool-proof as possible.

Related: The best Android games for kids [Age group-wise]

The Child Safety feature set was not built overnight; Apple has refined it over many iterations. In its current form, Child Safety in iOS 15 is broken into three parts. The first is Communication safety in the default Messages app. The second is CSAM detection in iCloud Photos. The third is safer guidance in Siri and Search. Below, we will look at what each of these means.

The default Messages app in iOS is arguably one of the most compelling parts of an Apple device, which is why this filtering system has the potential to be particularly effective. Communication safety adds an extra layer to your child's Messages UI. When they receive a sexually explicit image, the photo is blurred and they are warned about its sensitive nature. If they still wish to view it, there is a provision to send an alert to their parents or guardians. The same filtering comes into play if a minor chooses to send a sexually explicit photo: the parents are alerted if the child ignores the warning and presses send.

This feature is exclusive to accounts set up as families in iCloud in iOS 15, iPadOS 15, and macOS Monterey.
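To make the flow above concrete, here is a minimal Swift sketch of the decision logic as described. It is an illustration, not Apple's implementation: the `isSensitive` closure stands in for Apple's private on-device classifier, and the under-13 guardian-alert rule is an assumption.

```swift
import Foundation

// Minimal sketch of the Communication safety flow described above.
// `isSensitive` stands in for Apple's on-device image classifier,
// whose real API is not public; the under-13 rule is an assumption.
enum MessageAction {
    case showNormally
    case blurWithWarning(offerGuardianAlert: Bool)
}

func handleIncomingImage(_ imageData: Data,
                         isSensitive: (Data) -> Bool,
                         accountIsInFamily: Bool,
                         childIsUnder13: Bool) -> MessageAction {
    // Ordinary photos, or accounts outside a Family group, are untouched.
    guard accountIsInFamily, isSensitive(imageData) else {
        return .showNormally
    }
    // Sensitive photo: blur it, warn the child, and (for younger
    // children) offer to notify a parent or guardian if they proceed.
    return .blurWithWarning(offerGuardianAlert: childIsUnder13)
}
```

The key design point is that the check happens entirely on the device, so the photo itself never has to leave the child's iPhone for the warning to appear.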

CSAM detection in iCloud Photos is the most widespread implementation of the Child Safety program. It affects every user who backs up their photos to iCloud, which has caused a bit of a ruckus in the media lately. To detect and prevent the distribution of Child Sexual Abuse Material (CSAM), Apple is making use of advanced cryptography techniques. After an image is detected and confirmed to violate the guidelines, iOS 15 and iPadOS 15 devices will report the instance to the National Center for Missing and Exploited Children (NCMEC). NCMEC is the hub for CSAM-related reports and works closely with law enforcement offices across the United States to prosecute offenders.

The scanning of photos might raise some red flags, but Apple assures users that it does not scan all photos on iCloud. It only checks photos against a certain profile: a database of known CSAM image hashes supplied by NCMEC and other child safety organizations. On-device machine learning plays a massive role in CSAM detection; the matching happens on the device itself, which limits exposure while still analyzing sensitive photos accurately enough to offer parents peace of mind.
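Conceptually, the matching is closer to comparing fingerprints than looking at pictures. The Swift sketch below illustrates that idea with an ordinary SHA-256 digest; Apple's actual pipeline uses a perceptual NeuralHash and cryptographic private set intersection, which are not shown here, and the function names and empty hash list are placeholders.

```swift
import Foundation
import CryptoKit

// Conceptual sketch of hash matching: only a fingerprint (hash) of the
// photo is compared against a database of known CSAM hashes, never the
// photo's content itself. SHA-256 is used purely for illustration;
// Apple's system relies on a perceptual hash and cryptographic matching.
func loadKnownHashDatabase() -> Set<String> {
    // Placeholder: in the real system, the hash list ships with the OS
    // and comes from NCMEC and other child safety organizations.
    return []
}

func photoMatchesKnownHash(_ photoData: Data,
                           knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}
```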

Related: 26 Zoom Games for Kids

While CSAM detection in iCloud and Communication safety focus on shielding underage children from exposure, safer guidance in Siri and Search teaches users about problematic behavior. If a user employs Siri voice search or regular Search to look up queries related to Child Sexual Abuse Material, iOS 15 intervenes with a warning message. The warning explains that interest in such topics is problematic and harmful. iOS 15 even provides resources to help someone with the issue.
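As a rough illustration of that behavior, the sketch below routes a flagged query to a warning and help resources instead of normal results. The term list, warning text, and resource URL are placeholders, not Apple's actual data.

```swift
import Foundation

// Hypothetical illustration of the Siri/Search intervention: flagged
// queries receive a warning and help resources instead of ordinary
// results. The term list and resource URL are placeholders.
let flaggedTerms: Set<String> = ["placeholder-flagged-term"]

enum SearchResponse {
    case normalResults(query: String)
    case intervention(warning: String, resources: [URL])
}

func respond(to query: String) -> SearchResponse {
    let words = Set(query.lowercased().split(separator: " ").map { String($0) })
    if !words.isDisjoint(with: flaggedTerms) {
        return .intervention(
            warning: "Interest in this topic can be harmful. Help is available.",
            resources: [URL(string: "https://www.missingkids.org")!]  // illustrative resource
        )
    }
    return .normalResults(query: query)
}
```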

Child Safety is inherently built into iOS 15, so you cannot shrug it off for good. However, there are a couple of conditions that you must meet to keep the service working as intended. First, the Communication safety feature is exclusive to accounts that are set up as Families; standalone accounts do not have this feature. Also, as of now, CSAM detection only comes into play for iCloud Photos. If you do not use iCloud Photos, iOS 15 will not scan your private library for CSAM.

There is no distinct toggle to turn off the service, only indirect ways of disabling it.

Communication safety cannot be accessed by standard account holders. Your account must be set up as a Family in iCloud settings before the Child Safety feature will work in iOS 15.

Apple is already facing a lot of flak over the photo scanning announcement. However, the company assures users that it has no intention of scanning all of their photos and will not have access to any of their photos or messages. It will simply cross-check photos and messages against known CSAM profiles and report to the authorities if necessary.

Apple will not scan all the photos on your iOS 15-powered iPhone. It uses on-device machine learning to cross-reference only the photos you are uploading to iCloud, taking note of suspected CSAM entries. If you do not upload photos to iCloud, the CSAM service is disabled by default.

CSAM detection applies to both old photos and new ones. If someone has media files that depict the sexual abuse of children, the detection system would notify the authorities right away and appropriate action would be taken.

CSAM detection for photos in iCloud works by cross-checking photos against known CSAM images. If no matches are identified, the photo is given a green light. So, if you are worried about Apple reporting nude or semi-nude images of your own children on your smartphone, rest assured that the company is working with multiple organizations to make sure no false reports are registered.
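One safeguard worth sketching is the match threshold: Apple has described requiring multiple independent matches against the known-hash database before any case is reviewed, so a single false positive cannot by itself trigger a report. The sketch below shows the idea; the threshold value is purely illustrative.

```swift
// Sketch of the match-threshold safeguard: a case is escalated for
// human review only after several independent matches against the
// known-hash database, so one false positive alone changes nothing.
// The threshold value below is illustrative, not Apple's figure.
struct MatchTracker {
    private(set) var matchCount = 0
    let reportThreshold = 30

    mutating func recordMatch() {
        matchCount += 1
    }

    var shouldEscalateForHumanReview: Bool {
        matchCount >= reportThreshold
    }
}
```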

Apple has always been a vocal supporter of privacy, so much so that it prides itself on being the most private company out there. So, when Apple announced on-device scanning, alarms started to go off. Apple, however, has assured that the Child Safety system will not be used by the government or any other party to scan through users' personal files. The entire system is completely auditable and there are many fail-safes to protect user data.
