NSPCC: Apple Hiding True Extent of CSAM Figures

NSPCC accuses Apple of underreporting CSAM figures in the UK.

The National Society for the Prevention of Cruelty to Children (NSPCC), a prominent UK child protection organization, has alleged that Apple is vastly underreporting the number of Child Sexual Abuse Material (CSAM) cases on its platforms. According to the NSPCC, the CSAM figures Apple reports globally are far lower than the number of incidents detected in England and Wales alone.

In 2023, Apple reported 267 CSAM cases to the National Center for Missing & Exploited Children (NCMEC) in the US, stating that this figure covered incidents worldwide. However, according to the NSPCC's investigation, Apple was linked to 337 offenses in England and Wales alone between April 2022 and March 2023, pointing to a significant gap in the CSAM numbers reported by the technology company.

Concerns Over Apple’s Transparency in Reporting CSAM Figures

Richard Collard, who leads child safety policy at the NSPCC, voiced his concerns about Apple's reporting of CSAM cases. He highlighted the disparity between the frequency of child abuse image incidents on Apple's platforms in the UK and the minimal number of reports the company shares with authorities globally.

The NSPCC's investigation raises doubts about Apple's commitment to addressing CSAM on its platforms. Compared with tech giants like Google and Meta, which disclosed far larger volumes of CSAM cases in 2023, Apple's reported numbers seem low given its extensive global reach and user base.
