Combating child abuse
Apple's PR machine has been in overdrive following its announcement that it will scan photos synced to iCloud from iPhones and iPads for known child sexual abuse images. While no-one objects to efforts to protect children, there has been widespread concern (including from Apple's own employees) that the scanning mechanism could be open to government abuse. In an interview with TechCrunch, Apple rejected the charge, saying the system has a number of in-built protections - and "it's not very useful for trying to identify individuals holding specifically objectionable images." It also pointed out that scanning will initially happen only in the US - and can be avoided altogether by not syncing photos to iCloud.
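Apple has published only a high-level description of the mechanism, but the threshold protection it cites can be sketched in a few lines. The Python snippet below is a rough illustration under stated assumptions, not Apple's implementation: the hash function, the set of known fingerprints and the threshold value are all invented stand-ins, and the real system uses a perceptual hash ("NeuralHash") with cryptographic private set intersection rather than plain lookups.

    # Conceptual sketch only, not Apple's code. All names and values here are
    # hypothetical; an ordinary cryptographic hash stands in for the
    # perceptual hash the real system uses.
    import hashlib

    KNOWN_HASHES = {"a3f1c2...", "9bc07d..."}  # placeholder fingerprints of known images
    MATCH_THRESHOLD = 30                       # flag an account only beyond this count

    def fingerprint(image_bytes: bytes) -> str:
        # Stand-in for a perceptual hash; a real one survives resizing and re-encoding.
        return hashlib.sha256(image_bytes).hexdigest()

    def count_matches(photos: list[bytes]) -> int:
        # Count how many photos match the known-image database.
        return sum(1 for p in photos if fingerprint(p) in KNOWN_HASHES)

    def should_flag_for_review(photos: list[bytes]) -> bool:
        # A handful of matches reveals nothing and triggers nothing; only an
        # account exceeding the threshold is surfaced for human review - the
        # property Apple points to when it says the system is not useful for
        # identifying individuals holding specific images.
        return count_matches(photos) > MATCH_THRESHOLD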
The introduction of photo scanning is one of three initiatives announced by Apple. It's also enhancing the controls in Messages to alert parents when under-13s are about to view an explicit image. And searches in Siri and Search for terms related to child abuse will trigger a warning message. "As important as it is to identify collections of known CSAM [child sexual abuse material] where they are stored in Apple’s iCloud Photos service, it’s also important to try to get upstream of that already horrible situation... It is also important to do things to intervene earlier on when people are beginning to enter into this problematic and harmful area," Apple said.
Apple is far from alone among the tech giants in taking action to improve protections for children. This week, Google announced a series of measures that include an end to targeting ads at children and a reduction in location-tracking for under-18s. It will also restrict children's access to pornography and give parents the ability to remove images of their children from Google search results. Last month, Instagram made new under-16s' accounts private by default, meaning only approved followers can view, "like" or comment on posts. On the other hand, it's also going ahead with plans for a version of the app designed for under-13s.