{"componentChunkName":"component---src-templates-blog-post-js","path":"/blog/mandating-fairness-and-accuracy-assessments-for-law-enforcement-facial-recognition-systems/","result":{"data":{"site":{"siteMetadata":{"title":"No Frills News"}},"contentfulNfnPost":{"postTitle":"Mandating fairness and accuracy assessments for law enforcement facial recognition systems","slug":"mandating-fairness-and-accuracy-assessments-for-law-enforcement-facial-recognition-systems","createdLocal":"2021-05-26 14:30:44.280747","publishDate":"2021-05-26 00:00:00","feedName":"Image Recognition","sourceUrl":{"sourceUrl":"https://www.brookings.edu/blog/techtank/2021/05/26/mandating-fairness-and-accuracy-assessments-for-law-enforcement-facial-recognition-systems/"},"postSummary":{"childMarkdownRemark":{"html":"<p>Previous studies and cases on facial recognition useMIT’s Joy Buolamwini’s pioneering study first drew significant public attention to bias in facial recognition by demonstrating the presence of dramatically different error rates for darker-skinned women compared to lighter-skinned men in several widely used facial recognition systems.\nFairness and Accuracy in Facial Recognition SystemsNIST has already established criteria for evaluating the accuracy and fairness of facial recognition systems as part of its ongoing Facial Recognition Vendor Tests.\nIdeally, a federal facial recognition law would impose a uniform national policy requiring prior NIST testing of facial recognition systems used in law enforcement anywhere in the country.\nApplied to facial recognition software, this rule of thumb would require that differentials in facial recognition accuracy for subgroups defined by gender, age, race, or ethnicity should be no more than 20%.\nAssessment of fairness and accuracy of facial recognition technology systems is a key ingredient in this overdue review of resource allocation.</p>"}}}},"pageContext":{"slug":"mandating-fairness-and-accuracy-assessments-for-law-enforcement-facial-recognition-systems"}}}