After some research in the Apple developer documentation, a potential privacy issue has come to my attention. An app only has to request user consent for accessing the device camera once in order to persistently track your face using the front (and the new TrueDepth) camera through Apple's ARKit (https://developer.apple.com/augmented-reality/), a developer framework for creating augmented reality experiences.
Context
Using ARKit, an app developer can very easily track your face geometry in real time. This includes a detailed 3D mesh of your face topology (essentially your facial features and expressions). Here is an app that demos this feature, so you can see how accurate the 3D mesh is (it requires the TrueDepth camera, i.e. iPhone X or later):
Nose Zone
Your face is the controller. And your nose shoots lasers. (iPhone X+ or 3rd generation iPad Pro required) As featured in Fastco Design, 9to5Mac, Mac Observer, and more! "It's Nose Zone taking over the world!" - Fast Company Using the power of the new TrueDepth 3d camera, Nose Zone makes your...
apps.apple.com
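To illustrate how little code is involved, here is a minimal Swift sketch of the ARKit face-tracking setup. The class name is invented and error handling is omitted; the ARKit types and delegate callback are the real API:

```swift
import ARKit

// Minimal sketch: once the user has granted camera access, roughly this much
// code is enough to start receiving a live 3D mesh of the user's face.
// (Hypothetical class name; error handling and UI omitted.)
final class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth-equipped device
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called many times per second with an updated face anchor.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            let mesh = faceAnchor.geometry   // ~1,220 vertices of face topology
            _ = mesh.vertices                // detailed 3D positions, updated live
        }
    }
}
```

Note that nothing here is visible to the user: the session runs silently in the background of whatever the app is otherwise doing.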
The framework can also detect your emotional response through facial expressions, in terms of the movement of specific facial features (technical documentation with the list of supported facial features here: https://developer.apple.com/documentation/arkit/arfaceanchor/blendshapelocation#topics).
While the out-of-the-box expression detection is still fairly basic, there have been many advances in facial emotion detection using deep learning (https://algorithmia.com/blog/introduction-to-emotion-recognition), and those methods would be relatively easy to apply given the detailed 3D face mesh that ARKit makes available (which seems responsive enough to detect even micro-expressions).
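As a concrete sketch of what the blend-shape data looks like, here is how the per-frame coefficients could be turned into a crude expression signal. The thresholds and function name are invented for illustration; a real system would feed all ~50 coefficients into a trained classifier:

```swift
import ARKit

// Each blend-shape coefficient is a 0.0–1.0 value describing how strongly a
// facial feature is activated in the current frame. The thresholds below are
// made up for illustration only.
func roughExpression(from anchor: ARFaceAnchor) -> String {
    let shapes = anchor.blendShapes  // [BlendShapeLocation: NSNumber]
    let smileL  = shapes[.mouthSmileLeft]?.floatValue ?? 0
    let smileR  = shapes[.mouthSmileRight]?.floatValue ?? 0
    let browUp  = shapes[.browInnerUp]?.floatValue ?? 0
    let jawOpen = shapes[.jawOpen]?.floatValue ?? 0

    if (smileL + smileR) / 2 > 0.5 { return "smiling" }
    if browUp > 0.6 && jawOpen > 0.3 { return "surprised" }
    return "neutral"
}
```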
The TrueDepth technology on iPhone X also enables accurate eye tracking. Demo video:
This allows an app not only to detect your emotional response but also to pinpoint the exact location on the screen that is triggering it. This could be the exact headline you are reading in a news listing, or the exact face you are looking at in a photo containing multiple people.
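The raw gaze data behind this is exposed directly on the face anchor. The sketch below only shows the values an app receives every frame; mapping them to screen coordinates additionally requires projecting through the camera transform, which is omitted here (the function name is invented; `lookAtPoint` and the eye transforms are real ARKit properties, available since iOS 12):

```swift
import ARKit

// Sketch: ARFaceAnchor exposes the point the eyes converge on (in face-anchor
// space) plus the full 3D pose of each eyeball, on every frame.
func gazeSample(from anchor: ARFaceAnchor) -> (gaze: simd_float3, leftEye: simd_float4x4) {
    let gaze = anchor.lookAtPoint          // where the user is looking
    let leftEye = anchor.leftEyeTransform  // eyeball position and orientation
    return (gaze, leftEye)
}
```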
Potential Issues
To give you an idea of how dangerous this can be, here are a few examples (off the top of my head), since this can be used for far more than just "serving you better ads". One could expose a user to different types of content and test their reaction:
- See which political statements or (false) news trigger fear or anger and micro-target you with propaganda that is carefully designed to grind your gears (Russia would love this for sure)
- See how you react to (fake) dissident ideologies and assess the risk of rebellion (China would love this)
- See which people (or things) you are attracted to
- Detect your sexual preferences
- Perform a detailed psychological analysis/assessment by studying your emotional response to different photos and/or statements
- If the app owner is a hacker, they can use your emotional responses to better tailor social engineering attacks
- Profiling: Basically they can get to know you better than you know yourself by reading into your micro-expressions
Let me repeat one of my first sentences: An app only has to request user consent for accessing the device camera (once) in order to persistently track your face. Many people have already done this at some point with apps like Facebook, Instagram and Snapchat in order to post a photo or use video calling features. Even if you rarely use those camera-enabled features, these apps can keep tracking your face persistently while you are using the app.
Can any iOS developer confirm, refute, or nuance this?
Before someone comments on this: yes, I realise that this problem is not iOS-specific, but the TrueDepth technology on the latest iPhone devices does make it a lot easier (and more accurate), thereby nudging the world one step closer to the problematic situations I listed above.
More technical information on ARKit below:
ARFaceAnchor | Apple Developer Documentation
An anchor for a unique face that is visible in the front-facing camera.
developer.apple.com
Verifying Device Support and User Permission | Apple Developer Documentation
Check whether your app can use ARKit and respect user privacy at runtime.
developer.apple.com