Apple is reportedly delaying the rollout of its plan to scan iPhones and other devices for child sexual abuse material (CSAM) following backlash from privacy advocates, journalists, and the general public.
According to 9to5Mac, Apple announced that it will postpone its recently unveiled child safety features, including CSAM detection for iPhones, after feedback from privacy advocates and experts.
Apple said in a statement:
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple has recently faced increased scrutiny over its decision to scan user devices for CSAM.