
Apple to scan iPhone photos to detect child sexual abuse material


Apple is seemingly gearing up to introduce a client-side tool that would scan iPhones to identify child sexual abuse material (CSAM) on users’ phones. The system would reportedly rely on identifiers of known CSAM generated on Apple’s end, hashing these identifiers and running searches against users’ iPhone photos stored on iCloud. The scan would then tally the number of matches found on a phone, and if too many matches turn up, a report of the result would be sent back to Apple’s servers. The upcoming announcement has been tipped by Matthew Green, a cryptography and cybersecurity expert and associate professor of computer science at Johns Hopkins University.
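The exact matching scheme Apple plans to use had not been made public at the time of the report, but the flow described above can be illustrated with a minimal sketch: hash each photo, compare it against a database of known-CSAM hashes, and flag the account only once the match count crosses a threshold. The hash function, threshold value, and file handling below are assumptions for illustration only, not Apple’s actual implementation.

```python
# Minimal, hypothetical sketch of hash-based matching with a report threshold.
# SHA-256 stands in for whatever (likely perceptual) hash Apple actually uses;
# the database entries and threshold are placeholders, not real values.

import hashlib
from pathlib import Path

# Hypothetical database of hashes of known CSAM, supplied from Apple's end.
KNOWN_HASHES = {
    "<hex digest of a known flagged image>",
}

MATCH_THRESHOLD = 30  # illustrative only; any real threshold is not public


def photo_hash(path: Path) -> str:
    """Return a hex digest for a photo file (stand-in for the real hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def should_report(photo_dir: Path) -> bool:
    """Count photos whose hash appears in the database; flag if over threshold."""
    matches = sum(
        1 for photo in photo_dir.glob("*.jpg")
        if photo_hash(photo) in KNOWN_HASHES
    )
    return matches >= MATCH_THRESHOLD
```

The key design point in the reported scheme is that only aggregate match counts above a threshold trigger a report, rather than every individual comparison being sent to Apple.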

Detailing the matter in a series of tweets, Green wrote, “Initially, this will be used to perform client side scanning for cloud-stored photos. Eventually, it could be a key ingredient in adding surveillance to encrypted messaging systems. The ability to add a scanning system like this to E2E (end-to-end encrypted) messaging systems has been a major “ask” by law enforcement the world over.” He further raised concerns about what such technology could do in the wrong hands.

In simpler terms, the technology that Apple is reportedly set to release in order to crack down on child pornography would initially tap into non-encrypted photos, such as user uploads on iCloud. Apple has also generally been a strong advocate of user privacy, as in the San Bernardino case, where it resisted law enforcement’s demands to build a backdoor to help crack into the suspected shooter’s iPhone. A company’s previous actions, however, are no guarantee of its future ones.

As Green explained in his tweets, a technology such as the one Apple is building has the makings of the backdoor to end-to-end encryption that so many law enforcement bodies around the world have been asking for. Apple is essentially creating a technology that can look for identifiers of specific types of content even within end-to-end encrypted media. This, in turn, may have serious implications for freedom of speech in encrypted messaging apps, where actors with malicious intentions could use such technology to target select individuals. In other words, as technology companies have long maintained, backdoors in the wrong hands can have severe consequences.

For now, though, it is too early to say that such a move by Apple would be devastating to privacy and security. However, privacy and security advocates have already argued that such a move may not be in users’ best interests overall, even if Apple’s initial intentions are noble. As Green states, “These systems rely on a database of “problematic media hashes” that you, as a consumer, can’t review. Imagine someone sends you a perfectly harmless political media file that you share with a friend. But that file shares a hash with some known child porn file?”
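Green’s worry can be made concrete with a toy example: if two different files happen to produce the same hash, a matching system like the sketch above cannot tell a harmless file from a flagged one. The snippet below deliberately truncates a digest to two bytes so that such a collision is easy to find; real hashes are far longer, but the underlying concern, an unreviewable database that users cannot audit, is the same one Green raises.

```python
# Toy illustration of a hash collision causing a false match.
# Truncating SHA-256 to 2 bytes makes collisions cheap to find; this is
# purely illustrative and not representative of any real hash's strength.

import hashlib


def short_hash(data: bytes) -> str:
    """Truncated digest, used only to make collisions easy to demonstrate."""
    return hashlib.sha256(data).hexdigest()[:4]


flagged_digest = short_hash(b"contents of a known flagged file")

i = 0
while True:
    candidate = f"perfectly harmless file #{i}".encode()
    if short_hash(candidate) == flagged_digest:
        print(f"{candidate!r} collides with the flagged digest {flagged_digest}")
        break
    i += 1
```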

Going forward, it remains to be seen what Apple’s CSAM detection tool will bring to the table once it launches. According to the reports, an announcement on the matter is expected soon.

Pranchal Srivastava