Face Recognition: How Biometric Systems Are Deceived
Facial recognition biometrics are used to unlock smartphones, enter offices, pay fares and for purchases, and connect remotely to banking services. Biometric verification saves time both for those being verified and for those doing the verifying, which is why you are asked more and more often to "smile for the camera". Yet such verification should not be relied on completely: it can be deceived in many ways, from paper masks to AI deepfakes.
Future Crew
Faceless
In some cases, deceiving a biometric system requires no high technology at all. Quite often it is enough to print a photo of the victim's face and wear it like a mask. Such a fake can pass basic checks like "the head is moving" and "the object is not flat". One Brazilian fraudster managed to take out several bank loans with fakes this simple. Many systems, however, require proof that a living person, not a photo, is looking into the camera: the user must blink or smile. To bypass such liveness checks, attackers make a lifelike silicone or latex mask built to the victim's measurements. A crook wearing such a mask can imitate all the required facial movements and deceive both algorithms and humans. In one known case, offenders posing as a French minister swindled victims out of more than $90 million. Even so, some biometric identification systems can still be fooled simply by holding a photo of the right person in front of the camera; inexpensive solutions that work without human oversight, such as biometric screen locks on cheap smartphones, are the most exposed.
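Blink-based liveness checks of the kind described above are often built on simple facial-landmark geometry. One widely used heuristic is the eye aspect ratio (EAR): the ratio of the vertical to the horizontal distances between six eye landmarks drops sharply when the eye closes. The sketch below is a minimal illustration, assuming landmark coordinates have already been obtained from some face-landmark detector; the threshold value is purely illustrative.

```python
import math

def dist(a, b):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def eye_aspect_ratio(p1, p2, p3, p4, p5, p6):
    """EAR from six eye landmarks: p1/p4 are the horizontal corners,
    p2/p3 the upper lid, p6/p5 the lower lid."""
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

EAR_THRESHOLD = 0.2  # illustrative value; tuned per camera and detector

def is_blink(ear):
    """An eye is treated as closed when EAR falls below the threshold."""
    return ear < EAR_THRESHOLD

# Open eye: tall relative to its width -> high EAR
open_eye = [(0, 0), (1, 1), (2, 1), (3, 0), (2, -1), (1, -1)]
# Closed eye: lids nearly touching -> low EAR
closed_eye = [(0, 0), (1, 0.1), (2, 0.1), (3, 0), (2, -0.1), (1, -0.1)]

print(is_blink(eye_aspect_ratio(*open_eye)))    # False (eye open)
print(is_blink(eye_aspect_ratio(*closed_eye)))  # True (blink detected)
```

This is exactly the kind of check a realistic silicone mask defeats: the mask's eye openings move with the wearer's real eyes, so the EAR signal looks perfectly alive.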
Video shot in advance
Systems that cannot be deceived by photos and masks can be attacked by substituting video from another source for the camera feed. For instance, the victim is filmed in advance, and the recording is then played back and fed into the biometric system. For such spoofing, special firmware is installed on a smartphone, or an external device is connected to a computer in place of the web camera. Using this scheme, fraudsters in China issued fake tax invoices worth $76 million in the names of unwitting citizens whose images had been bought on the black market.
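One defensive heuristic against this kind of feed substitution is checking that incoming frames vary the way live camera footage does: a static image or a short looped clip produces exact repeats, while sensor noise makes duplicates in a genuine feed vanishingly rare. The sketch below is a hypothetical illustration; the frame bytes and the repeat threshold are assumptions, not any particular product's logic.

```python
import hashlib

def looks_replayed(frames, max_repeats=3):
    """Flag a feed as suspicious if any frame hash repeats more than
    max_repeats times -- live camera noise makes exact duplicate
    frames extremely unlikely."""
    counts = {}
    for frame in frames:  # each frame is raw bytes
        h = hashlib.sha256(frame).hexdigest()
        counts[h] = counts.get(h, 0) + 1
        if counts[h] > max_repeats:
            return True
    return False

# A static photo fed into the system: every "frame" is identical
static_feed = [b"same-photo-bytes"] * 10
# A live feed: noise makes every frame slightly different
live_feed = [f"frame-{i}".encode() for i in range(10)]

print(looks_replayed(static_feed))  # True
print(looks_replayed(live_feed))    # False
```

A real anti-spoofing stack combines many such signals (timing jitter, compression artifacts, challenge-response prompts), since a sophisticated injected video stream will not repeat frames either.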
The next level of video-based deception is the use of deepfakes. Technically, the attack works like the video-spoofing scheme above, but there is no need to film the victim: their photos and video are generated on the fly from publicly available materials by generative AI. Attacks of this type produce fakes so convincing that, by Gartner's estimate, 30% of companies will no longer be able to rely on facial biometrics in isolation from other verification methods by 2026. Large-scale attacks on this technology are already happening, such as a fake video call that caused $25 million in damage.
From exploits to makeup
Verification systems can contain software and hardware bugs, so it is sometimes possible to exploit such a vulnerability and gain approval without passing any photo or video checks at all. Last year fraudsters used a technical flaw in Aadhaar, India's national biometric identification system, to debit citizens' accounts en masse without authorization. India's central bank had to issue instructions on additionally protecting accounts with classic one-time SMS codes.
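The one-time SMS codes the regulator fell back on are straightforward to generate safely with a cryptographically secure random source. A minimal sketch, assuming a six-digit code (the length and the function name are illustrative, not taken from any specific banking system):

```python
import secrets

def generate_otp(digits=6):
    """Generate a zero-padded numeric one-time code using a
    cryptographically secure random source (never random.random)."""
    return f"{secrets.randbelow(10 ** digits):0{digits}d}"

code = generate_otp()
print(len(code))       # 6
print(code.isdigit())  # True
```

The key point is using `secrets` rather than the ordinary `random` module: OTPs are a security control, so their values must be unpredictable to an attacker.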
All of the methods above are typically used by fraudsters to impersonate someone else and either steal money or gain access to sensitive information. But there is a completely different front in the "anti-biometrics battle": ordinary people who simply want to remain unrecognized in public places. Here the goal is not to pretend to be someone else but to make face identification difficult, for instance by covering the face with a hood or a scarf. There are high-tech approaches too: with makeup invisible to the eye, a special pattern printed on clothing or held in the hands, or glasses with a specially chosen frame pattern, a person becomes "invisible" to face recognition systems.
Those concerned about the active collection of biometrics by banks and commercial organizations can turn to the new Law No. 572-FZ, which governs the handling of such data. Under this law, every citizen can see on the Public Services Portal which organizations want to identify them by biometrics and opt out in a couple of clicks.