http://www.digitaljournal.com/tech-and-science/technology/q-a-where-is-biometric-technology-heading-next/article/538914

Q&A: Where is biometric technology heading next? Special

Posted Dec 13, 2018 by Tim Sandle
Facial recognition has its supporters and detractors. Is facial recognition a force for good in solving crimes, or a violation of human rights? Are other types of biometrics better suited? Martin Zizi of NeuroPrint weighs in.
File photo: Biometrics in use.
Most conversations around the use of biometrics like facial recognition trigger questions about human rights and privacy. But there are other important considerations too: what happens when biometric data is compromised? How serious is the misidentification problem? Where is this biometric data stored, and who has access to it?
According to Dr. Martin Zizi of Aerendir, there needs to be more conversation and focus around biometric technology that resists hacking, protects privacy, and does not rely on externally observable traits (faces, fingerprints, and so on).
Digital Journal caught up with Zizi to discuss his thoughts on biometrics in general.
Digital Journal: How big is facial recognition set to become?
Dr. Martin Zizi: Facial recognition use is on the rise, and according to a forecast by MarketsandMarkets, the market is expected to reach $41.8 billion by 2023. According to the summary, the government sector will be the key driver of the overall market’s growth (mainly immigration and travel applications), but consumer technology, financial services, healthcare, the automotive industry and beyond will also be affected by the incorporation of facial recognition. The latter portion of this – often referred to as the Internet of Things (IoT) – is where the industry needs to be careful, as it raises significant privacy and security concerns.
DJ: What are the main applications?
Zizi: As of now, law enforcement is the main application. In fact, according to a 2016 report published by Georgetown University’s Center on Privacy and Technology, almost half of all U.S. adults are in facial recognition networks because they have licenses in states where police are permitted to search driver's license photos for face matches.
DJ: What are the main privacy concerns?
Zizi: We know that the government can use technology to keep tabs on its citizens. In March 2017, a House oversight committee hearing was told that over half of all adult Americans’ photographs are stored in facial recognition databases which can be accessed by the FBI. Some 80% of those photographs came from non-criminal sources like passports and driver’s licenses. Even more worrying, the algorithms used to identify people are wrong 15% of the time and are more likely to misidentify ethnic minorities and women.
This is where facial recognition found its place in Facebook’s arsenal of tools. By adding your face to their database, when a friend uploads a photo, Facebook can suggest tagging you with minimal effort. The feature was decried as a cynical means for Facebook to gather more data on their users — especially since you were auto-enrolled in the feature. The European Union even decided that Facebook’s facial recognition was an invasion of users’ privacy and blocked its expansion inside the EU. At the end of the day, privacy is about the user’s right to decide what is public and what is private. Choice is the key word.
DJ: What other developments are there with biometrics?
Zizi: Live biological signals are the next frontier. They are, by definition, impossible to spoof.
Biometrics operate in three different modalities. They can be anatomical, like your fingerprint, iris or facial features. They can be behavioral, like the way we walk and run or the way we hold our phones; behavioral biometrics are based on the repetitive ways we act, and such data can be used to build motion repertoires, or collections of habits, which define us. Or they can be physiological, meaning data that comes from the live function of our bodies. Examples of the latter would be voice recognition, heart rhythms and brain activity.
DJ: What happens in situations where biometric data is compromised?
Zizi: When one’s biometrics are compromised, that person is effectively locked out of the recognition system. If one’s face or voice has been compromised, it may be used to impersonate them all over the internet, so the only solution is to block this identifier – and since a person cannot get a new face or voice, he or she is essentially out. This is the main reason why we cannot store biometrics in massive databases; it is too dangerous.
DJ: Is biometric data at a particular risk from hacking?
Zizi: Most biometrics solutions – from facial to fingerprint – have a high risk of being hacked. Simple internet searches show that YouTube offers 340 tutorials on how to hack a fingerprint sensor.
For example, in 2017, a Financial Times reporter based in London asked his twin brother to call his bank (which uses voice recognition in some of its systems), and the brother was able to spoof the system successfully.
There are also more subtle ways of being compromised outside of any hacking activity. In the case of face ID for example, anyone can get into your phone simply by pointing it at you. For thieves, this feature will make the iPhone X more tempting to snatch since, if the phone is locked, they will have an easy way to open it before running off.
Computer-generated imagery (CGI) can also compromise facial recognition. Recently, adversarial neural networks have been programmed to defeat another neural network that had been trained to recognize faces with 98% accuracy. When just 3-5 pixels were removed from a 10-million-pixel image, accuracy fell to zero.
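The fragility Zizi describes can be illustrated with a toy sketch (not the actual attack he cites): for a simple linear "recognizer" with known weights, flipping only the handful of pixels the model weighs most heavily is enough to turn a confident match into a rejection. All names and values here are hypothetical, chosen purely for illustration.

```python
import numpy as np

# Toy linear "face recognizer": a positive score means "match".
rng = np.random.default_rng(0)
w = rng.normal(size=100)           # weights over a 100-"pixel" image
x = np.sign(w) * 0.2               # an image the model confidently matches

# Adversarial tweak: alter only the k most influential pixels,
# pushing each one hard in the direction that lowers the score.
k = 5
idx = np.argsort(np.abs(w))[-k:]   # indices of the k largest-magnitude weights
x_adv = x.copy()
x_adv[idx] = -np.sign(w[idx]) * 5.0

print("original score:", w @ x)      # positive: recognized
print("perturbed score:", w @ x_adv) # negative: no longer recognized
```

Only 5 of the 100 pixels change, yet the decision flips, because those few pixels carry a disproportionate share of the model's evidence – the same asymmetry that real adversarial attacks on deep networks exploit.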
In a follow-up interview, Zizi explains his NeuroPrint™ technology, a cloudless physiological biometric technology that measures micro-vibrational patterns in a user’s hands. See: “Q&A: Vibrational biometric system for identity recognition.”