Mandating fairness and accuracy assessments for law enforcement facial recognition systems



Previous studies and cases on facial recognition use

MIT's Joy Buolamwini's pioneering study first drew significant public attention to bias in facial recognition by demonstrating dramatically different error rates for darker-skinned women compared to lighter-skinned men in several widely used facial recognition systems.

Fairness and Accuracy in Facial Recognition Systems

NIST has already established criteria for evaluating the accuracy and fairness of facial recognition systems as part of its ongoing Face Recognition Vendor Test program. Ideally, a federal facial recognition law would impose a uniform national policy requiring prior NIST testing of any facial recognition system used in law enforcement anywhere in the country. Applied to facial recognition software, the four-fifths rule of thumb familiar from employment discrimination law would require that differentials in facial recognition accuracy across subgroups defined by gender, age, race, or ethnicity be no more than 20%. Assessment of the fairness and accuracy of facial recognition technology systems is a key ingredient in this overdue review of resource allocation.
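To make the 20% differential concrete, a minimal sketch of such a check follows, interpreting the rule of thumb in its four-fifths form: no subgroup's accuracy may fall below 80% of the best-performing subgroup's accuracy. The subgroup labels and accuracy figures are illustrative assumptions, not results from any actual benchmark or NIST test.

```python
def within_differential(accuracies: dict[str, float], threshold: float = 0.8) -> bool:
    """Return True if every subgroup's accuracy is at least `threshold`
    times the accuracy of the best-performing subgroup."""
    best = max(accuracies.values())
    return min(accuracies.values()) >= threshold * best

# Hypothetical subgroup accuracies, echoing the disparity Buolamwini documented.
example = {
    "lighter-skinned men": 0.99,
    "darker-skinned women": 0.78,
}
print(within_differential(example))  # 0.78 < 0.8 * 0.99, so this prints False
```

A regulator could apply the same check to any subgroup partition (gender, age, race, or ethnicity) by swapping in the relevant per-group accuracy figures from a standardized test.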