The Photo Print Attacks Dataset offers a specialized resource for enhancing Presentation Attack Detection (PAD) models, particularly for assessing liveness detection. With over 5,000 unique individuals, this dataset is a valuable asset for AI developers aiming to improve anti-spoofing capabilities. Used by both iBeta and NIST FATE, it is structured to support advanced AI model training focused on detecting photo print attacks.
The dataset includes over 5,000 photo print attacks, featuring diverse participants with balanced representation of gender and ethnicity. Each attack is captured in a 10–20 second video that meets liveness detection standards, with high-quality imagery and realistic color to simulate authentic conditions.
The data collection process involved a large group of participants and carefully staged photo print attacks. Each attack video employs a zoom-in effect, as specified by NIST FATE, to improve the AI’s ability to distinguish print attacks from live subjects. Flat photos were used to ensure accuracy, with no bending or skewing, providing a consistent, straight-on view toward the camera.
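The zoom-in effect described above can be approximated in preprocessing as a progressive center crop of each frame. The sketch below is a minimal, library-free illustration (frames are represented as 2D lists; a real pipeline would operate on decoded video frames); `zoom_crop` is a hypothetical helper, not part of any dataset tooling.

```python
def zoom_crop(frame, zoom):
    """Center-crop a frame (2D list of pixels) by a zoom factor >= 1,
    approximating the zoom-in effect used in the attack videos.
    Assumption: a rectangular, non-empty frame."""
    h, w = len(frame), len(frame[0])
    ch, cw = max(1, int(h / zoom)), max(1, int(w / zoom))
    top, left = (h - ch) // 2, (w - cw) // 2
    return [row[left:left + cw] for row in frame[top:top + ch]]

# Toy 8x8 "frame" whose pixels record their own coordinates.
frame = [[(r, c) for c in range(8)] for r in range(8)]
cropped = zoom_crop(frame, 2.0)  # 4x4 center region
```

Applying `zoom_crop` with an increasing `zoom` value across consecutive frames mimics the gradual zoom-in that the attack videos exhibit.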
This dataset is ideal for developing and refining liveness detection models that must reliably differentiate between genuine selfies and photo print attacks. It is particularly beneficial for organizations working on facial recognition and biometric authentication that aim to improve the accuracy of spoof detection in PAD systems.
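When training a PAD model on a dataset of unique individuals like this one, a common precaution is a subject-disjoint train/test split, so the model is evaluated on identities it has never seen. A minimal sketch, assuming a hypothetical file-naming scheme `subject_<id>_attack.mp4` (not the dataset's actual layout):

```python
import random

def split_by_subject(clips, test_frac=0.2, seed=0):
    """Split clip paths into train/test sets with no subject overlap,
    so spoof detection is evaluated on unseen identities.
    Assumes hypothetical names like 'subject_0421_attack.mp4'."""
    subjects = sorted({c.split("_")[1] for c in clips})
    rng = random.Random(seed)
    rng.shuffle(subjects)
    n_test = max(1, int(len(subjects) * test_frac))
    test_ids = set(subjects[:n_test])
    train = [c for c in clips if c.split("_")[1] not in test_ids]
    test = [c for c in clips if c.split("_")[1] in test_ids]
    return train, test

clips = [f"subject_{i:04d}_attack.mp4" for i in range(10)]
train, test = split_by_subject(clips)
```

Splitting on subject ID rather than on clips prevents identity leakage, which would otherwise inflate spoof-detection accuracy.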
A sample version of this dataset is available on Kaggle. Leave a request for additional samples in the form below.
This dataset is specifically designed for assessing liveness detection algorithms, as used by iBeta and NIST FATE. It is curated to train AI models to recognize photo print attacks targeting individuals. The attacks incorporate zoom-in effects, as recommended by NIST FATE, to enhance AI training outcomes.
Best used for:
Tell us about yourself, and get access to free samples of the dataset.
© 2022 – 2024 Copyright protected.