Supply Chain Market Research - SCMR LLC

Apple’s Touchy Subject

8/11/2021

In 2015, two violent extremists attacked the Inland Regional Center in San Bernardino, killing 14 people and injuring another 22.  The attackers were killed in a later police shootout, and the iPhone 5C of one was recovered.  After discovering that the phone had been set to erase all of its data after ten failed passcode attempts, the FBI requested, under the 'All Writs Act' (a piece of legislation that originated in 1789), that Apple (AAPL) create a software program that would enable the agency to unlock the recovered phone, which ran iOS 9, and other similarly protected devices.  Apple declined the request, but the day before the court hearing the FBI said it had found a third party able to unlock the phone.  Apple has faced a number of similar situations in terrorism- and drug-related cases but has been able to preserve its staunch privacy mandate, one that its users find quite attractive.
Last week Apple announced that it would roll out a system called neuralMatch as part of an iOS 15 update later this year.  The simplest part of the system watches for searches relating to child sexual abuse material (CSAM) through Siri and the Apple search application and directs the user toward ways to report such abuse or get help with such a problem.  The second part is a bit more invasive, essentially adding parental controls to the Messages application: it blurs sexually explicit images for any user under 18 and notifies parents if a child 12 or younger tries to view or send such pictures.  But the third part of the system is where much of the controversy lies, as it scans the images in a user's iCloud Photos library and, if it finds such material, reports it to Apple moderators, who have the option to pass it on to the National Center for Missing & Exploited Children (NCMEC).
The Messages portion of the system gives parents the choice of whether to opt in.  If they do, an image classifier designed to detect pornography looks for sexually explicit images in the app, obscures the picture, and asks the user whether they really want to view or send the image.  For family accounts, the update also checks whether a user who clicks through these warnings is 12 or younger; if so, the user is warned that their parents will be notified, although the actual image will never be seen by the user's parents.  Parents are notified only if the user continues through the process and actually views or sends such content; simply receiving an image, with no further action, triggers nothing.  We note that this check runs against images on the user's phone rather than in the cloud, which some privacy groups view as a violation of the very privacy principles that Apple has cited in court cases.
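Apple has not published the exact client-side logic, so the following is a minimal sketch of the decision flow as described above; the function name, fields, and structure are our own illustrative assumptions, not Apple's code:

    # Hypothetical sketch of the Messages parental-notification flow
    # described above. Apple has not published this logic.

    def handle_explicit_image(age: int, parental_opt_in: bool, action: str) -> dict:
        """Decide what happens when the on-device classifier flags an image.

        action is "send", "view", or "receive" (receiving alone triggers nothing).
        """
        if not parental_opt_in:
            return {"blur": False, "warn_user": False, "notify_parents": False}

        result = {"blur": age < 18, "warn_user": False, "notify_parents": False}

        if age < 18 and action in ("send", "view"):
            result["warn_user"] = True  # the user must click through a warning
            if age <= 12:
                # the child is told parents will be notified;
                # parents never see the image itself
                result["notify_parents"] = True
        return result

    print(handle_explicit_image(age=11, parental_opt_in=True, action="view"))
    # {'blur': True, 'warn_user': True, 'notify_parents': True}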
The Apple system does scan images in iCloud accounts: if you sync, your images reside on iCloud servers, where they are checked against a list of 'known CSAM', and if enough matches are found the account is sent to a moderator, who can confirm the matches, close the account, and notify the authorities.  Apple is certainly not alone here, as many social media companies use a Microsoft (MSFT) tool, PhotoDNA, to scan their servers for CSAM, but until now Apple has resisted, scanning only iCloud Mail.  If the new system is implemented, it will differ in that it will check images directly on the user's iPhone rather than on servers, which brings the privacy question a bit closer to home.  Interestingly, the software that does the 'checking' is not really 'looking' at photos the way a facial recognition system would; it reduces each image to a hash, or digital signature (a long list of numbers), that represents the key features of the image.  That hash value is compared against a list of millions of image hashes provided by the NCMEC, and if a match is found an alert is placed on the account.
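Apple's hashing algorithm (called NeuralHash in the technical summary linked below) is not fully public, but the matching step can be illustrated with any perceptual hash.  Here is a minimal sketch that stands in a simple 'average hash' for NeuralHash; the hash function, match threshold, and blocklist values are all illustrative assumptions:

    # Illustrative perceptual-hash matching, NOT Apple's NeuralHash.
    # An 8x8 "average hash": similar images yield similar bit patterns,
    # so near-duplicates still match after resizing or recompression.
    from PIL import Image  # pip install Pillow

    def average_hash(path: str, size: int = 8) -> int:
        """Shrink to size x size grayscale; each bit = pixel above the mean."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming(a: int, b: int) -> int:
        return bin(a ^ b).count("1")

    # Hypothetical blocklist of known-image hashes (NCMEC supplies the real list).
    KNOWN_HASHES = {0x8F3C_55AA_0F0F_33CC}   # placeholder value
    MATCH_DISTANCE = 5                        # illustrative threshold

    def is_match(path: str) -> bool:
        h = average_hash(path)
        return any(hamming(h, k) <= MATCH_DISTANCE for k in KNOWN_HASHES)

Per the CSAM Detection Technical Summary linked below, Apple's actual pipeline uses a neural network to produce the hash and wraps the comparison in private set intersection and threshold secret sharing, so the device itself never learns whether any individual image matched.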
Apple is quick to point out that it only examines photos that are synced with an iCloud account, meaning that turning off iCloud sync disables the system completely and no image hashes are generated.  While the system will certainly help to out those who traffic in CSAM, others say that moving surveillance directly onto a user's iPhone opens a pathway to breaking end-to-end encryption[1] and undermines the privacy, and the lack of a backdoor, that Apple has worked so hard for in the past.  There is some nuance here: end-to-end encryption does exist in Apple's Messages application, which means neither Apple nor parents will have access to the messages themselves, but that is not the case on iCloud servers (not only Apple's), which is what makes it possible for companies to run such CSAM searches. 
This difference is Apple's defense against criticism of the new system, along with its claim that there is less than a 1-in-1-trillion chance per year of incorrectly flagging a given account.  But there is great fear that other countries will use the Apple precedent as a way to dismantle encryption in the name of fighting terrorism or misinformation, and that Apple itself retains the power to modify the safeguards it has built into the system.  Apple responds as follows:
Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.
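Apple has not published the parameters behind its 1-in-1-trillion estimate, but the arithmetic of a match threshold shows how such a figure can arise.  A sketch with purely illustrative numbers; the per-image false-match rate, library size, and threshold below are our assumptions, not Apple's published parameters:

    # Illustrative only: how requiring many independent matches drives the
    # account-level false-positive rate far below the per-image rate.

    p = 5e-4          # assumed chance one innocent photo falsely matches a hash
    n = 10_000        # assumed number of photos in the account's library
    threshold = 30    # assumed number of matches required to flag the account

    # Binomial upper tail P(matches >= threshold), computed term by term
    # to avoid the huge integers math.comb would produce at this scale.
    term = (1.0 - p) ** n                 # P(exactly 0 matches)
    tail = term if threshold == 0 else 0.0
    for k in range(n):
        term *= (n - k) / (k + 1) * p / (1.0 - p)   # P(k+1) from P(k)
        if k + 1 >= threshold:
            tail += term

    print(f"chance an innocent account is flagged: {tail:.2e}")
    # ~3e-14 with these assumptions: the threshold, far more than the
    # per-image rate, is what makes the headline number so small.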
This is a very complicated issue, one that almost all admit is misunderstood by most, and it has the potential to become highly politicized.  It raises questions we could not even begin to claim we understand or can answer, but with change underway, we thought it important to make our readers aware. 
For further reading:
https://www.missingkids.org/HOME
https://www.apple.com/privacy/docs/Building_a_Trusted_Ecosystem_for_Millions_of_Apps.pdf
https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
https://appleprivacyletter.com/


[1] End-to-end encryption is a process in which no one other than the two communicating parties holds a 'key' to decrypt the data being sent between them.  Lesser systems encrypt in much the same way, but the company holds a key that would allow decryption.
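As an illustration of the footnote's distinction, who holds the key determines who can read the data.  A minimal sketch using the Python cryptography package's symmetric Fernet primitive; this is a generic illustration (real end-to-end systems negotiate keys between devices with asymmetric key exchange), not Apple's protocol:

    # Generic illustration of footnote [1]: key placement decides access.
    from cryptography.fernet import Fernet  # pip install cryptography

    # --- Provider-held key (e.g., typical cloud storage) ---
    provider_key = Fernet.generate_key()    # stored on the provider's servers
    ciphertext = Fernet(provider_key).encrypt(b"photo bytes")
    # The provider can decrypt (and therefore scan) at any time:
    assert Fernet(provider_key).decrypt(ciphertext) == b"photo bytes"

    # --- End-to-end: the key never leaves the two endpoints ---
    endpoint_key = Fernet.generate_key()    # generated and held on-device only
    ciphertext = Fernet(endpoint_key).encrypt(b"message text")
    # The server relays `ciphertext` but holds no key, so it cannot decrypt;
    # only the endpoints sharing `endpoint_key` can recover the message.
    assert Fernet(endpoint_key).decrypt(ciphertext) == b"message text"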