
Ed Heaver, Founder and CEO, Serve Legal

Digital ID and facial biometrics, terms that once felt distant in conversations about the retail sector, have become integral to it. Over the past year in particular, this technological duo has significantly simplified identity verification, security checks and the customer experience. The potential benefits are substantial, especially for age verification on products such as alcohol, tobacco, vapes and other restricted items. But the innovation does not come without its challenges, leaving retailers cautiously contemplating adoption.

Privacy concerns and data protection

The elephant in the room is unmistakable: privacy and data protection. The gathering, storage and use of biometric data raises legitimate concerns about consumer privacy and is undoubtedly the biggest challenge retailers face. They must navigate this terrain with caution, enforcing robust privacy policies and security measures to protect the sensitive information collected.

Bias and fairness

‘Shopping while Black’ is a phrase commonly used in the retail industry to describe marketplace discrimination based on racial profiling. While facial recognition and age estimation technologies offer tremendous potential benefits, they do not recognise a diverse crowd with equal accuracy. Studies suggest that these systems correctly identify around 99% of white male faces but struggle with people of colour, especially Black women. Developers and retailers must grapple with ensuring the fairness of their systems, particularly for age estimation and identity verification. Bias and fairness are not just technical concerns but ethical imperatives that demand our collective attention.

What can we do?

· Be transparent: As retailers adopting AI, it is your responsibility to clearly communicate to customers how their data will be used, stored, and protected.

· Protect your data: Put robust data protection measures in place to safeguard the privacy of consumers’ information. Adhere to existing GDPR policies and procedures to ensure best practice in data security.

· Diversify your data: Address inherent biases in facial recognition and age estimation algorithms by ensuring diverse and representative datasets are used by your technology providers. Strive for inclusivity in the data used to train these systems to mitigate the risk of unintentional discrimination.

· Conduct regular audits and assessments: Implement a system of regular independent audits and assessments for your facial biometric technology. Periodic checks like these will help identify and rectify biases that may emerge over time, ensuring that the system is fair and accurate.

· Collaborate: To tackle bias in facial biometric technology, developers, organisations, and regulatory bodies need to come together to establish standards that prioritise fairness, transparency, and accountability.


Serve Legal is a provider of ID and compliance testing services in the UK & Ireland.