
Local experts and advocacy groups speak out on Apple's scanning of iPhones for child sexual abuse


Apple is introducing safety features to limit the spread of child sexual abuse images, drawing concern from local cybersecurity experts and support from child protection groups.

One of these features scans images on Apple devices for known child sexual abuse material.

Apple said it will detect child abuse material with user privacy in mind. The system uses cryptographic technology to check an image against known child sexual abuse material before the image is stored in iCloud.

Apple then manually reviews each report to confirm a match with child sexual abuse material; if the match is confirmed, it disables the user's account and sends a report to the National Center for Missing & Exploited Children (NCMEC).
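Apple's published design relies on a perceptual image hash ("NeuralHash") matched against a database of known abuse-material hashes through a private set intersection protocol, so Apple learns nothing about photos that do not match. A much-simplified sketch of the underlying idea, hashing a photo and checking it against a set of known hashes before upload, might look like the following in Swift; the SHA-256 hash and the placeholder hash list here are illustrative stand-ins, not Apple's actual code.

```swift
import CryptoKit
import Foundation

// Illustrative sketch only. Apple's announced system uses a perceptual hash
// (NeuralHash) plus a cryptographic private set intersection protocol; a
// plain SHA-256 lookup like this is just a stand-in for the concept.

// Hypothetical placeholder for the database of known-material hashes that
// NCMEC and other child-safety organizations would supply.
let knownMaterialHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

/// Hashes a photo's bytes and checks the result against the known-hash set.
/// In the announced system this comparison happens on the device, before the
/// photo is uploaded to iCloud, and only matches are flagged for human review.
func matchesKnownMaterial(photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownMaterialHashes.contains(hex)
}
```

Because the comparison runs on the device and only a confirmed match produces a report, Apple says it does not see the content of photos that do not match.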

There will also be an update to iMessage to warn children and their parents when a child receives or sends sexually explicit photos. The feature is designed so that Apple does not get access to the messages.

Ron Jones, a cybersecurity instructor at Harrisburg University, said that while the effort could curb the distribution of these images, the update violates citizens' constitutional rights.

"The founding fathers said that no one —  including the government should go into your personal effects and take judgment, " said Jones. "You don't need probable cause and you don't need a search warrant."

These concerns also resonate with Chris Kirchner, executive director of the Children's Advocacy Centers of Pennsylvania (PennCAC), a statewide organization that provides resources and training to stop abuse and help children heal.

" I think it's good to talk about those concerns. We have some fantastic technology available to us and you know we wanna make sure it's used correctly," said Kirchner. However, Kirchner has confidence in Apple's process.

Kirchner said Apple's plan could strain agencies like NCMEC with a heavy caseload, but called it a move forward in protecting children.

"I think it's a great step in the right direction," Kirchner said. "If technology can help us keep predators away from our kids, why wouldn't we — why wouldn't we take advantage of that?" said Kirchner.

According to Apple, the chance that the system incorrectly flags any given account is less than one in one trillion per year.

These features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8 and macOS Monterey.
