Exploring visual and biometric spoofing in modern cyber threats.
In February 2025, Vidoc Security Lab co-founder Dawid Moczadlo caught an applicant using deepfake software to hide their face during a remote interview. He muted the applicant's audio for privacy reasons, but the applicant was visibly uncooperative, especially when asked to perform a simple hand gesture over their face.
Note that this is one of the lower-end examples of deepfake technology.
| Visual Threat | Description |
|---|---|
| Deepfakes | Malicious actors use AI to alter, clone, or create voice samples, images, and/or videos, either replicating the biometric authenticators of a real identity or fabricating a new one. Voice and camera analysis alone will not reliably guard against deepfakes in biometric authentication checks. |
| Injection Attacks | Images or videos are fed digitally into the identity verification process, often through a virtual camera emulator that bypasses physical camera checks. More complex attacks may involve compromising the vendor's API or SDK to inject data directly. |
| Presentation Attacks | An attacker presents 2D printouts, lifelike 3D masks, replayed videos, or deepfakes displayed on a screen to the camera, targeting systems with unreliable automated checks. |
| Evasion Attacks | Attackers physically evade, hide, or alter their true features, creating dramatic or distorted facial expressions or wearing heavy, complex makeup to reduce the accuracy of human or automated recognition. They may also use hair, glasses, or an article of clothing such as a hat, hoodie, or scarf to block parts of their faces and prevent clear image capture. |
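Defenses against injection attacks often begin with cheap heuristics before any deeper signal analysis. The sketch below is a minimal, assumption-laden example in Python: it flags enumerated camera devices whose names match common virtual-camera software. The marker list and the `flag_suspect_cameras` helper are illustrative assumptions, not any vendor's API, and a determined attacker can rename a virtual device, so this should only ever be one signal among several.

```python
# Hypothetical heuristic for spotting virtual camera emulators by name.
# Virtual cameras often register device names that reveal the software
# behind them; this marker list is illustrative, not exhaustive.

VIRTUAL_CAMERA_MARKERS = (
    "obs virtual camera",
    "manycam",
    "snap camera",
    "droidcam",
    "virtual",
)


def flag_suspect_cameras(device_names):
    """Return the enumerated camera names that match a known
    virtual-camera marker (case-insensitive substring match)."""
    suspects = []
    for name in device_names:
        lowered = name.lower()
        if any(marker in lowered for marker in VIRTUAL_CAMERA_MARKERS):
            suspects.append(name)
    return suspects


if __name__ == "__main__":
    devices = ["Integrated Webcam", "OBS Virtual Camera", "ManyCam Virtual Webcam"]
    print(flag_suspect_cameras(devices))
```

In practice this name check would be paired with stronger signals, such as challenge-response liveness prompts (like the hand-over-face gesture in the interview above) and sensor-level integrity checks.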