Can You Trust Apple’s New Fingerprint Sensor on the iPhone 5S?

As I have some background in the space, several people have asked me about Apple’s new fingerprint replacement for your smartphone’s passcode. Other than IBM ThinkPads, which were hobbled by poor integration due to the complexity of Windows, this is the first large-scale consumer deployment of fingerprint sensors.

Most people think of Mission Impossible when you mention retinal scans or fingerprint sensors. Popular culture has elevated “biometric authentication” to represent the pinnacle of “high security.” Unfortunately, the very phrase “biometric authentication” is a misnomer.

Biometrics—literally a biologically derived number—in the best case provides no more than a unique identification. This unique number has no inherent security; it can be copied, stolen and misused just as easily as a password.

Used within a secure system, biometrics can be an extremely convenient alternative or supplement to “something you know”—a password or PIN. People I’ve spoken to are intrigued by the iPhone’s new sensor because it seems faster and more secure than the lock-screen PIN. But other people’s first response has been concern over the privacy of their fingerprint.

And that concern is not entirely unwarranted.

The consequences of getting security wrong are grave, because unlike a password, you can’t very well change your fingerprints! If you want to use your fingerprint to unlock your phone and all the capabilities that entails, we need to be sure that your fingerprint isn’t accessible to malicious software running on the phone.

When working on the first USB keys, my team and I at Rainbow Technologies explored this subject in great depth (see patents 6,671,808, 7,272,723, 7,269,844, and 7,111,324), and developed several products embodying the resulting ideas.

With the “superkey” (pictured), we took security to its logical conclusion: it was the first thumbprint-activated USB key, which, importantly, kept your biometric permanently inside a cryptographically and physically secure container. The device would create a digital signature on your behalf when authorized to do so by the presence of your thumb on its sensor, but would never reveal the underlying capability to do so. A simple, fixed-function device such as this can be designed to meet rigorous “trusted system” objectives and even constructed to provide tamper resistance and tamper evidence.
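The principle behind such a device can be sketched in a few lines. This is a toy illustration, not Rainbow’s actual design: the secret lives only inside the device object and is never exported; the host receives signatures over challenges, and only while a finger is present on the sensor. (An HMAC stands in for a real digital signature here.)

```python
import hmac
import hashlib
import secrets

class FingerprintToken:
    """Toy model of a thumbprint-activated key: sign on demand, key never leaves."""

    def __init__(self):
        self._key = secrets.token_bytes(32)   # generated inside the device
        self._finger_present = False

    def place_finger(self, matched: bool):
        # In real hardware, matching happens on-device against a stored template;
        # the raw print never crosses the USB port.
        self._finger_present = matched

    def sign(self, challenge: bytes) -> bytes:
        if not self._finger_present:
            raise PermissionError("finger not present: signing refused")
        # The host sees only the output, never the key.
        return hmac.new(self._key, challenge, hashlib.sha256).digest()

token = FingerprintToken()
challenge = secrets.token_bytes(16)   # host-supplied nonce

token.place_finger(matched=True)
sig = token.sign(challenge)
print(len(sig))  # prints 32: a 32-byte MAC; the key itself is never observable
```

The design choice worth noticing: the only way to exercise the key is through the narrow `sign` operation, so even a fully compromised host can at worst request signatures while your thumb is on the sensor—it can never steal the key or the fingerprint.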

Now this is tricky business, because the key becomes your trusted personal agent—while we had to assume that the PC on the other side of the USB port was insecure and infected with viruses or other malicious software.

Fast forward 15 years and little has changed. Our smartphones run general-purpose operating systems which are complex and full of vulnerabilities that allow attackers to gain access. While Apple has done a better job of maintaining the security of their OS—albeit through sometimes draconian policies—an Android-based phone is no less likely to contain malicious software than your long-plagued Windows PC. While solid systems are emerging to provide a trusted environment on smartphones (e.g., Fixmo, an iNovia portfolio company), such systems are not currently readily available to you as a consumer.

Bottom line: today’s smartphone is not a trusted device to which you want to reveal your fingerprint.

Which brings us back to the iPhone 5S—is it secure?

Ideally, Apple’s design incorporates a physically separate, tiny computer within the sensor package—a processor, memory, and cryptographic capability with exclusive access to the sensor. This is neither economically nor technically impractical. The iPhone proper shouldn’t even be connected to the wires coming from the sensor, except for the power supply; instead, it would see only an abstracted API for determining whether your finger is present and for authorizing payment transactions or digitally signing documents. This would be a fantastic step forward in security.
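What might such an abstracted API look like? The sketch below is entirely hypothetical—none of these names are Apple’s—but it illustrates the crucial property: what is *absent*. No call returns fingerprint data or key material to the OS; the OS gets only a yes/no presence check and an opaque, enclave-signed authorization.

```python
import hmac
import hashlib
import secrets
from dataclasses import dataclass

@dataclass
class AuthResult:
    authorized: bool
    attestation: bytes  # opaque to the OS; verifiable by a relying party

class ToyEnclave:
    """Toy stand-in for a sensor enclave. The OS sees only the two public
    methods below—never the template, the key, or a raw scan."""

    def __init__(self):
        self._template = b"enrolled-fingerprint-template"  # never leaves
        self._key = secrets.token_bytes(32)                # never leaves
        self._scan = b""

    # -- sensor side (hardware-internal in a real design) --
    def _receive_scan(self, scan: bytes):
        self._scan = scan

    # -- the only OS-visible surface --
    def is_finger_present(self) -> bool:
        """Yes/no only—never an image or template."""
        return hmac.compare_digest(self._scan, self._template)

    def authorize(self, transaction_digest: bytes) -> AuthResult:
        """Sign the digest inside the enclave if a matching finger is present."""
        if not self.is_finger_present():
            return AuthResult(False, b"")
        mac = hmac.new(self._key, transaction_digest, hashlib.sha256).digest()
        return AuthResult(True, mac)

enclave = ToyEnclave()
enclave._receive_scan(b"enrolled-fingerprint-template")
result = enclave.authorize(hashlib.sha256(b"pay $5").digest())
print(result.authorized)  # prints True
```

The narrowness of the interface is the security: even malware with full control of iOS could only ask “is a finger present?” and request authorizations, never exfiltrate the biometric itself.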

And I don’t say “except for the power supply” lightly. Believe it or not, “power attacks” exist in which keys can be inferred by watching the power consumption of a security subsystem—but that is a level of detail beyond this blog post. My point is that separating and protecting an authentication system from attack is of the utmost importance.

Early indications are that Apple “stores your fingerprint in a secure area of the A7 processor.” It’s promising that they acknowledge the need to separate your biometric information from the larger, complex iOS. But I’m concerned that the design may isolate your private biometric only when it’s stored—that it is still processed by iOS on the main CPU, and thus will become accessible to a determined hacker at some point in the future. We’ll need to wait until additional details are available.

For the moment, I’m cautiously optimistic for two reasons.

First, Apple acquired AuthenTec last year and, having known the founding team and the culture of that organization, I am hopeful that they will at least have been aware of, and appropriately sensitive to, any security shortcuts that may have been taken.

Second, because Apple has full control over their integrated design, the team certainly had the opportunity to do a good job on the overall security architecture. It will be more challenging for the Android world to achieve a secure design—although on the positive side, any vulnerabilities will be discovered more quickly in that world.