Interview: Will Apple’s Child Safety Measures Harm Individual Privacy?

Written by

Om Moolchandani, CISO and head of research at Accurics

Last month, tech giant Apple unveiled plans to introduce a raft of new measures to protect children from being exposed to abusive material, such as child pornography, across its apps and devices. These will primarily be driven by machine learning technology, which determines whether images sent or received by children in an iCloud family meet the criteria for abusive content.

One of these measures involves using new technology in iOS and iPadOS to enable Apple to detect child sexual abuse material (CSAM) stored in iCloud Photos before reporting it to the National Center for Missing and Exploited Children (NCMEC). The technology enables on-device matching against a database of known CSAM image hashes provided by NCMEC. This database is transformed into an unreadable set of hashes securely stored on users' devices.
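To make the on-device matching concrete, below is a minimal, heavily simplified sketch. It is not Apple's implementation: Apple's published design reportedly uses a perceptual "NeuralHash" and a blinded hash database compared through cryptographic private set intersection, whereas this sketch substitutes a plain SHA-256 digest and an in-memory set. The `CSAMMatcher` type and all names in it are hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical on-device matcher. A plain SHA-256 digest and an in-memory
// Set stand in for Apple's perceptual NeuralHash and blinded hash database;
// none of the cryptographic protections in the real design are modeled here.
struct CSAMMatcher {
    // Hex-encoded digests of known images, as they might be shipped to the device.
    private let knownHashes: Set<String>

    init(knownHashes: Set<String>) {
        self.knownHashes = knownHashes
    }

    // Hash the raw image bytes and check membership in the known-hash set.
    func matches(imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }
}

// Usage sketch: the real hash list would be supplied by NCMEC via Apple,
// and a photo would only be checked as part of the iCloud upload pipeline.
let matcher = CSAMMatcher(knownHashes: [])   // empty placeholder database
let photo = Data()                           // image bytes from the photo library
print(matcher.matches(imageData: photo))     // prints "false" here
```

A cryptographic hash like SHA-256 only matches byte-identical files, whereas a perceptual hash is designed to survive resizing and re-encoding, which is also where the "false positive" concerns discussed below arise. Apple's announced design additionally requires a threshold number of matches and human review before any report is made, none of which is modeled in this sketch.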

While the aims of these initiatives are laudable, the use of machine learning to track the distribution of content has raised several privacy concerns. These include the potential for individuals to fall victim to 'false positives.' There are also fears that the technology could soon come to the attention of governments, who may look to use it for nefarious purposes. To discuss these issues, and the safeguards that can be put in place to preserve privacy rights when such technologies are used, Infosecurity recently caught up with Om Moolchandani, CISO and head of research at cloud security company Accurics.

Do you believe Apple, to date, has done enough to demonstrate it has adequate privacy safeguards in its new child sex abuse photo scanning technology?

No. Ideally, Apple should open up its scanning software to independent security researchers to test and verify its claims. Apple says it has implemented certain safeguards to protect the privacy of the photos it will scan. Still, it has announced that this feature will also scan photos received or delivered by the iMessage system. The scans will happen on the device and not just in iCloud, making it very difficult to attest that such capabilities cannot be weaponized or abused.

What are your main concerns regarding using machine learning technologies to undertake tasks such as image scanning in Apple's new child safety features?

Apple claims it will only scan for child sex abuse material (CSAM) and sexually explicit material for accounts, devices and iCloud storage belonging to teenagers and those of vulnerable age. Artificial intelligence/machine learning-based scans would be performed on devices. Unless the algorithms are audited and attested by trusted third parties, it will be difficult for Apple to establish a narrative of trust with consumers. How will it guarantee other private photographs and images will not be pulled into the system? Every photo will have to be scanned in order to detect CSAM, and there is every possibility this capability could be compromised or used by nation-states for surveillance. 

"There is every possibility this capability could be compromised or used by nation states for surveillance"

What privacy issues could arise from the creation of a library of image fingerprints? Could this potentially be used by nefarious governments to target journalists/dissidents?

Absolutely. In the past, we have observed large tech companies buckling under government pressure and opening up their platforms to facilitate information sharing and carry out government requests. Governments have increasingly used technology platforms as state instruments to target dissenters, protesters and those with alternative political views, and even for espionage, surveillance and to deliver and discharge statecraft.

This kind of database and freeform scanning of private image data can easily be weaponized to detect personal pictures that may be explicit in nature but exchanged between consenting parties. If the parties involved are of consequence or high profile, they can be subjected to spying and a violation of privacy.

Are there measures you would like to see Apple put in place to ensure these child safety initiatives can be implemented safely and in a way that protects individual privacy?

Detecting CSAM on devices and within data communications is a noble intent, especially for underage, teenage and vulnerable-age internet users. However, it should be optional and only enforced when there’s a legal requirement. One way to deliver such CSAM-detection features is by working with special interest groups and authorities on new legislation. Technology platforms should not become moral authorities for enforcement unless they are required to by law. In this case, Apple isn’t required by law to scan for CSAM.

When such services are necessary, they should undergo independent and government audits. This can instill public confidence and enforce accountability for the use of such technology. There also need to be checks and balances around the legal ramifications if these monitoring services are misused and individuals are affected.

More broadly, do you believe we are at a stage where we can rely on machine learning to protect people from online harms, or are such technologies in need of refinement before being used in this way?

Every technology, irrespective of its readiness stage, requires acceptance and trust from society. One way to help guide this is through governance (acts, laws and regulations). For example, when telecommunication was an emerging technology, the implementation of laws and regulations provided the necessary governance mechanisms for society to accept it.

Similarly, machine learning and artificial intelligence might be ready to use today, but society probably won’t be prepared to accept them fully until authorities develop assurance and governance models for the technologies’ safe and secure use. They must also follow the principles of the constitution and the law of the land; these are the elements that provide warranties to society for trusting any entity, including technology.
