The launch of the iPhone X brought a new unlock feature, known to many as the facial recognition scanner. With Face ID, iPhone X users can unlock their phone using the front camera, which scans their unique facial features. This is a step up from the fingerprint sensor. Convenient as that sounds, privacy experts have raised concerns about how well the device actually protects that data. Face ID certainly keeps other people out of your phone, but is your facial data protected from leaving the device and falling into the wrong hands? Here, we take a closer look at how your facial scan is stored, and how Apple handles it in its latest smartphone.

Before we get into that issue, we should ask whether Face ID provides adequate protection in the first place. Apple claims Face ID is secure, touting internal data that puts the chance of the phone unlocking with another person's face at roughly one in a million. To unlock the phone, the device stores the user's face scan inside a secure enclave[1], which means your facial data stays on the device. However, one cannot dismiss the fact that app developers can access related face data, albeit in a limited way.


Here, some technology commentators worry about who can access the sensor. Advertisers, for instance, could use the camera to determine exactly where on the screen a user is looking during an advertisement, or whether the user is looking at it at all, as Geoffrey Fowler noted in his review for the Washington Post. He also mentioned that Apple's terms for app developers forbid them from accessing the camera without permission or using face data for advertising. After Fowler pressed Apple executives, he saw one positive change: applications that use face data are now required to publish a privacy policy. How well that protects the user's privacy, however, remains to be seen.

Apple actually allows app developers to take certain facial data off the phone and store it on their own servers, as long as they ask the user for permission and do not sell that data to third parties. Although this access is limited, developers can capture a rough map of the user's face along with numerous expressions. In practice, that means an app can tell when you smile, blink, or raise an eyebrow.[2]
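To make that concrete, here is a minimal sketch of what this developer-facing access looks like through Apple's ARKit framework, the interface the reporting above describes. The `ARFaceAnchor`, `geometry`, and `blendShapes` APIs are real; the class name and the handful of expressions sampled here are illustrative choices, and a shipping app would need a camera-permission string in its Info.plist and a published privacy policy, per Apple's terms.

```swift
import ARKit

// Sketch of a session delegate reading the rough face mesh and the
// per-expression coefficients ARKit exposes to third-party apps.
class FaceDataReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires the TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let face = anchor as? ARFaceAnchor else { continue }
            // A rough geometric map of the face (a mesh of vertices) --
            // not the mathematical representation Face ID uses to unlock.
            let vertexCount = face.geometry.vertices.count
            // Expression coefficients in [0, 1]: this is how an app can
            // tell when you smile, blink, or raise an eyebrow.
            let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            let blink = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
            let brow  = face.blendShapes[.browInnerUp]?.floatValue ?? 0
            print("vertices: \(vertexCount) smile: \(smile) blink: \(blink) brow: \(brow)")
        }
    }
}
```

Nothing in this code unlocks the phone; it only receives the mesh and expression values that Apple chooses to expose, which is exactly the distinction discussed below.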

According to documentation about the face unlock system that Apple released to security researchers, the face data accessible to developers cannot be used to unlock a phone: unlocking requires a mathematical representation of the face, not the visual map developers receive[3]. At the very least, that means app developers cannot remotely unlock your phone and access your data. Still, the fact that facial data can be shared at all does not sit well with everyone.

This is where the privacy protection becomes really questionable. Once app developers can store the user's rough facial data on their own servers, how well can Apple enforce its privacy rules on those developers? After all, advertisers want that facial data, whether to gauge the effectiveness of their advertisements or simply to study their target audience.

Apple remains firm that its enforcement tools are effective: pre-publication reviews, audits of applications, and the threat of removing an app from the App Store. The problem is that Apple does not thoroughly review the source code of applications, according to 2011 Congressional testimony from Apple's Bud Tribble; instead it relies on random spot checks and complaints, a weak defense against privacy abuse. As for the privacy policies published by application developers, they tend to be long and complex, and most users will never read them.

Another means of enforcement is the threat of kicking apps out of the App Store. Apple does have a good record of holding developers accountable for violations, but, as the American Civil Liberties Union's Stanley points out, Apple has to find them first. These rules may look strong on paper, and small application developers will most likely follow them, but larger companies such as Uber have a record of breaking Apple's rules[4]. That calls Apple's enforcement mechanism into question.

All in all, the introduction of the Face ID authentication system is, without a doubt, revolutionary in the race for smartphone supremacy. However, one should not dismiss the privacy risks posed by new and flashy technologies. The user's privacy may not be as protected as they think: while Apple does have defenses against privacy violations by application developers, their effectiveness remains to be seen.


[1] https://www.inverse.com/article/38048-apple-face-id-scan-unlock-privacy
[2][3] https://www.reuters.com/article/us-apple-iphone-privacy-analysis/app-developer-access-to-iphone-x-face-data-spooks-some-privacy-experts-idUSKBN1D20DZ
[4] https://arstechnica.com/gadgets/2017/04/tim-cook-once-slapped-uber-on-the-wrist-for-breaking-the-app-store-rules/


© 2018 by Privacyguard.net, an LiVenture. All rights reserved. No part of this document may be reproduced or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without prior written permission of LiVentures.
