The treatment, protection, and regulation of biometric data are essential to ensure the privacy and safeguarding of users. That’s why it’s important for you to understand what needs to be taken into account to comply with the General Data Protection Regulation (GDPR) following the guidelines of the Spanish Data Protection Agency (AEPD) in the context of modern biometrics.
In the fascinating universe of biometrics, things have changed a lot in the last 15 years. In the past, it was about comparing points on your face, yes, like a sort of facial map! But today, we are in the era of neural networks, those intelligent algorithms that help us create biometric vectors that will be compared with other vectors to recognize people. It sounds a bit like science fiction, doesn’t it?
But rest assured: these vectors are irreversible and are not interoperable across providers, which means your biometric data stays safe. Even if they are lost or stolen, your face cannot be reconstructed from them. And the best part is that all of this is done in strict compliance with the GDPR and the guidelines of the AEPD.
Are you wondering how all of this privacy and data protection works within a project that uses biometrics? Well, it’s a journey that begins with the conception and design of the technology itself. So, get comfortable because we are about to dive into a story of data, people, privacy, and technology. Who could ask for more?
Biometric data handling: evolution of biometrics
In their early stages, biometric recognition technologies and algorithms were based on comparing facial points. Essentially, a matrix was generated to analyze different points on the face and the distances between them. The more points analyzed, the more robust the algorithm. In fact, a common question from clients in our first meetings was: how many facial points can your technology analyze?
Today, the biometric algorithms used to recognize individuals are based on AI, specifically deep learning systems (neural networks) that generate irreversible biometric vectors or templates.
Deep learning algorithms have revolutionized the field of biometrics, allowing them to work their magic, while irreversible biometric patterns make systems much more robust and secure.
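To give a rough idea of how vector-based matching works, here is a minimal sketch. This is not Mobbeel's actual implementation — real engines use high-dimensional embeddings and provider-specific distance metrics and thresholds — but it shows the key point: the comparison happens entirely between vectors, and the original image is never needed again.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two biometric vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def same_person(template_a, template_b, threshold=0.7):
    """Decide whether two templates belong to the same person.

    The 0.7 threshold is purely illustrative; real systems calibrate
    it against target false accept / false reject rates.
    """
    return cosine_similarity(template_a, template_b) >= threshold
```

Note that nothing in this comparison allows the face to be recovered: the network's mapping from image to vector is many-to-one and is not invertible.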
Nevertheless, biometric recognition technologies remain under the scrutiny of the AEPD, as projects in various fields and sectors keep emerging that are rejected or sanctioned by the agency.
A significant portion of the negative reputation that biometrics has among regulatory bodies as a mechanism for identity verification stems from how the technology functioned years ago, when biometric samples were more vulnerable than they are today, after the paradigm shift towards neural networks.
Having been in the market for over 14 years, we have experienced this evolution firsthand. Today, our facial recognition and voice biometrics technologies are based on artificial intelligence.
Despite the AEPD’s reservations about biometric projects, safeguarding users’ biometric data is crucial for us, especially in the current climate where there is a global awareness of personal data privacy. Complying with the GDPR and the AEPD is a fundamental requirement to ensure user privacy, and at Mobbeel, we advocate for responsible development of biometric technology that respects the rights and privacy of individuals.
Irreversible Vectors That Don’t Work with Other Providers
It is of vital importance to consider privacy from the very moment you begin designing the technology, ensuring the reliability and proportionality of the solution while minimizing data retention.
Current biometric vectors are irreversible and non-interoperable among providers, ensuring privacy and preventing the exposure of biometric data.
In other words, in a hypothetical scenario where our biometric templates were lost or stolen, our privacy would not be compromised: it is impossible to reconstruct the original facial photograph from the vector that makes up the template.
Furthermore, biometric engines from one provider are not compatible with those from another provider, which means these templates cannot be used in different biometric systems or for other purposes.
However, despite the security provided by new biometric systems based on deep learning algorithms, both biometric vectors and user information must be treated in accordance with the guidelines of the GDPR and the AEPD.
Impact assessment according to the AEPD
The fact that biometrics are here to stay is undeniable. We use biometric data for many purposes, such as confirming our existence, identifying ourselves, verifying who we are, tracking us, creating profiles of us, and making automated decisions, among other things. These processes range from the simplest and most common, like unlocking a mobile phone, to more high-risk activities like authorizing payments and transactions.
Therefore, when we use biometric data in a specific process, the intrusion into people’s privacy and its impact can vary. This depends on the technique used, but also on how we define the process, its nature, where and when it occurs, the context, and, above all, what we aim to achieve.
Hence, to understand how these biometric operations affect us, we must assess them in the context in which they are used and consider their ultimate goals.
Some methods of working with biometrics require people’s cooperation, while others can capture biometric data from a distance without individuals realizing it, as is the case with surveillance cameras.
For this reason, all biometric techniques must be evaluated for their appropriateness, proportionality, and necessity, considering their purpose and how they impact the rights and freedoms of individuals, as well as the risks they pose, both for individuals and society.
There are different ways to classify biometric systems, such as those based on different technologies, devices, or studied traits. However, when it comes to demonstrating compliance with the General Data Protection Regulation (GDPR) and assessing the risks to the rights and freedoms of individuals due to data processing, it is important to use classification criteria for biometric operations from the perspective of data protection and their relationship with the process in which they are applied.
Some criteria recommended by the AEPD for categorizing the type of biometric operation within a processing framework include:
- Purpose of the biometric data operations in relation to the purpose of the processing.
- Legal framework.
- Scope of the processing.
- Qualified human intervention regarding the biometric result.
- Processing of special categories of data.
- Transparency of the biometric operation.
- Subject’s free choice regarding the biometric operation.
- Suitability of the biometric operation.
- Minimum data to be captured.
- Adequacy and necessity of the biometric operation.
- User’s level of control.
- Implicit side effects in the biometric operation.
- Personal data breaches.
- Context of the processing.
To ensure that biometric techniques are appropriate from the outset, as required by the GDPR, we must follow the recommendations of the Privacy by Design Guide. This involves analyzing factors to ensure compliance with the rules and assessing if the process is necessary and proportionate, which facilitates risk management.
Given that biometric operations pose special challenges, many processes may require an impact assessment under the GDPR. In some cases, prior consultation may be needed as stipulated in Article 36 of the GDPR.
Compliance with GDPR and AEPD by Mobbeel
Now, let's examine the most important aspects to consider to ensure that a biometrics project complies with the GDPR and the AEPD when handling this data and protecting privacy. To do this, strict compliance with the Spanish Organic Law on Data Protection (LOPD) is also essential.
- The data flow is managed on servers in Europe, and Mobbeel acts as the data processor, complying with data retention agreements and informed consent.
Regarding the data retention policy, we do not act as custodians of the data but as data processors. The system is configured to automatically delete all personal information from onboarding processes (images, photos, and information extracted from documents) seconds after the process is completed. This retention period can be agreed jointly with the client in case registration information needs to be reviewed.
- There must be voluntary consent from the user, and they should also be provided with an alternative mechanism for authentication when conducting a biometric recognition process. Furthermore, the key lies not only in the voluntary consent given by the user but also in it being informed consent, meaning the user must be aware of the purpose and implications of using biometric technology in the process they are about to undertake.
- System proportionality: the biometric recognition systems used must be individual systems designed to enroll and identify a single person. We emphasize this in contrast to surveillance systems that indiscriminately record all users with facial recognition technologies. European data protection agencies do not permit projects whose use of recording cameras is disproportionate to the stated purpose, as this infringes on the fundamental rights of the people recorded. Essentially, it would be like using a sledgehammer to crack a nut.
- Regarding Digital Onboarding processes, which involve capturing an identity document in addition to biometrics, we collect data from the ID card and the selfie to validate the identity of the person undergoing Onboarding to ensure there is no fraud. These data are used for validation and are immediately deleted. Then, an irreversible biometric vector is stored, along with a user identifier associated with that vector, which is completely anonymous and random.
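The retention and anonymization policy described above can be sketched as follows. This is a toy in-memory illustration, not Mobbeel's implementation: the class, its names, and the default retention period are all assumptions made for the example. The point is the data lifecycle: personal data is kept only briefly and then purged, and what remains is an irreversible template keyed by a random, anonymous identifier.

```python
import time
import uuid

class OnboardingStore:
    """Toy sketch of a retention policy: short-lived personal data,
    long-lived anonymous templates."""

    def __init__(self, retention_seconds=30):
        self.retention_seconds = retention_seconds
        self._personal = {}   # session_id -> (saved_at, document/selfie data)
        self._templates = {}  # anonymous user_id -> biometric vector

    def save_personal(self, session_id, data, now=None):
        """Hold captured personal data only while validation runs."""
        saved_at = now if now is not None else time.time()
        self._personal[session_id] = (saved_at, data)

    def purge_expired(self, now=None):
        """Delete personal data older than the retention period."""
        now = now if now is not None else time.time()
        expired = [sid for sid, (ts, _) in self._personal.items()
                   if now - ts >= self.retention_seconds]
        for sid in expired:
            del self._personal[sid]
        return len(expired)

    def store_template(self, vector):
        """Keep only the irreversible vector, under a random identifier
        that carries no link to the person's real identity."""
        user_id = str(uuid.uuid4())
        self._templates[user_id] = vector
        return user_id
```

In a real deployment the purge would be enforced by the platform itself (for example, storage-level TTLs) rather than by application code, but the contract is the same: after the window closes, only the anonymous template survives.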
Security measures, certifications and evaluations
If there is an essential certification to ensure information security, it is ISO 27001, which is an international standard that establishes the requirements for the implementation, maintenance, and continuous improvement of an Information Security Management System.
In this regard, Mobbeel is certified in the ISO 27001 standard for “Information Security Management System that supports the development, implementation, and support activities of Mobbeel’s digital identity verification solutions, both in Software as a Service (SaaS) mode and on-premises installations, according to the Declaration of Applicability dated 13/03/2023.”
Likewise, all information stored by our cloud products is encrypted using the mechanisms provided by the underlying storage systems, for files, databases, and backups alike.
Additionally, our digital onboarding solution has been approved and included in the Catalog of Products and Services for Information Technology and Communication Security (CPSTIC), confirming its compliance with the highest security requirements demanded by the National Cryptologic Centre (CCN).
The tests conducted to be included in the catalog encompass various aspects, including counterfeit attempts on identity documents and attacks on the facial recognition component (screens, videos, hyper-realistic masks, professional makeup, deepfakes, etc.). All these tests are detailed by the CCN in Technical Instruction IT-14 for the certification of the Biometric Evaluation Module (MEB).
Accuracy, Reliability, and Bias
Biometric engines trained with artificial intelligence models learn to recognize faces with greater precision and fewer biases than humans, who are generally worse at distinguishing faces of ethnicities other than their own (the so-called other-race effect). Therefore, we must begin with the premise that the technology brings more reliability to identification processes than human review does.
The National Institute of Standards and Technology (NIST) is a U.S. government agency responsible for evaluating the effectiveness and accuracy of facial recognition algorithms and promoting the prevention of racial, age, and gender biases.
In fact, various biometric technologies assessed by NIST are evaluated using databases that are balanced in terms of gender, race, and age to analyze the effectiveness and security of biometric algorithms.
Mobbeel’s facial recognition algorithms are regularly subjected to the FRVT (Facial Recognition Vendor Test).
Our NIST evaluation shows that our system can be tuned to allow only one false positive (accepting the wrong person) per million impostor attempts. At that operating point, fewer than one in a hundred facial recognition processes would need to be repeated because a genuine user was rejected.
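For readers who want to see how these two error rates relate, here is a minimal sketch of how a false accept rate (FAR) and false reject rate (FRR) are computed from comparison scores. The scores and threshold below are made up for illustration and have nothing to do with the figures reported above.

```python
def far_frr(impostor_scores, genuine_scores, threshold):
    """FAR: fraction of impostor comparisons accepted (score >= threshold).
    FRR: fraction of genuine comparisons rejected (score < threshold)."""
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr
```

Raising the threshold lowers the FAR at the cost of a higher FRR, which is why evaluations like NIST's report the false reject rate measured at a fixed false accept rate (here, one in a million).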
The technological revolution brought about by the use of biometrics for identification processes has necessitated robust regulations to protect user privacy. In this context, GDPR and AEPD are fundamental pillars that ensure privacy protection.
Throughout this article, we have explored the evolution of biometrics, from facial points to irreversible biometric vectors, all while complying with stringent regulations. We’ve demonstrated how data is handled, considering proportionality and minimizing information retention. Furthermore, we’ve emphasized the importance of accuracy and reliability, addressing bias. Finally, certifications and evaluations support the security of this technology.
For all these reasons, we must view biometrics and data protection as inseparable allies rather than irreconcilable enemies, and for us, respect for people’s privacy is non-negotiable.
Write to us if you want to implement a verification system that complies with GDPR and follows the guidelines set by the AEPD to ensure the protection of your customers’ biometric data.