Artificial Intelligence Can Now Fool Biometrics

November 15, 2018

Five researchers recently announced that they had developed a way to use AI to mimic human fingerprints, a biometric that has become a popular security measure. Philip Bontrager of the New York University engineering school led the team, which named its project DeepMasterPrints.

DeepMasterPrints is a utility that generates fingerprints intended to fool security systems, and it appears able to match more than one in five real fingerprints enrolled in a biometric security system. This is possible because biometric systems don't analyze the entire fingerprint, only a few key points and the relationships between them. As it turns out, fingerprints may be more similar to one another than we previously thought.
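To see why matching only a few key points leaves room for look-alikes, here is a minimal Python sketch of a minutiae-style comparison. It is illustrative only, not any vendor's actual algorithm; the (x, y, angle) point format, tolerances, and acceptance threshold are assumptions chosen for the example.

    # Toy minutiae-style matcher (illustrative assumptions, not a real sensor's algorithm)
    import math

    def points_match(p, q, dist_tol=10.0, angle_tol=0.3):
        """Treat two key points as the same if they are close in position and angle."""
        (x1, y1, a1), (x2, y2, a2) = p, q
        close = math.hypot(x1 - x2, y1 - y2) <= dist_tol
        aligned = abs(a1 - a2) <= angle_tol
        return close and aligned

    def is_match(probe, template, min_hits=12):
        """Accept if enough probe key points line up with the stored template."""
        hits = sum(any(points_match(p, t) for t in template) for p in probe)
        return hits >= min_hits

Because only a handful of key points have to agree, a synthetic print that happens to share those points with someone's finger can pass as a match.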

The system was developed by feeding real fingerprints into an AI process that then created new, unique fingerprints while keeping certain commonly shared areas consistent.

Much as a script can mount a "dictionary attack" by pushing millions of passwords through a system to see what sticks, DeepMasterPrints runs millions of generated fingerprints through a system to see what works. Fortunately, unless the attack has been optimized for a specific cell phone's sensor, it's not likely to work. So, for now, your pictures and texts are still safe behind your phone's fingerprint sensor.
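The dictionary-attack analogy boils down to a simple loop: keep generating candidates and trying them until one is accepted. The sketch below shows the idea only; generate_synthetic_print() and sensor_accepts() are hypothetical stand-ins, not part of DeepMasterPrints.

    # Brute-force idea only; the two callables are hypothetical stand-ins
    def dictionary_style_attack(generate_synthetic_print, sensor_accepts, attempts=1_000_000):
        """Try many machine-generated prints, the way a password attack tries many guesses."""
        for _ in range(attempts):
            candidate = generate_synthetic_print()
            if sensor_accepts(candidate):
                return candidate  # a "master print" the system accepts
        return None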

The point of Bontrager's exercise was to encourage the development of more secure biometric systems. He suggests that developers find ways to verify that a live human is present when granting access, so the system can't be fooled by a synthetic print.

This article was based on a November 15, 2018 Gizmodo article by Jennings Brown.
