Corsight AI today announced that its facial recognition technology ranked top for reducing bias in the most recent facial recognition benchmarking test (FRVT 1:1 Verification), conducted by the National Institute of Standards and Technology (NIST). The results show its software achieves significant advances in reducing demographic bias within the algorithm.
NIST is widely recognized for setting national and international standards within the facial recognition industry. It aims to address pervasive issues and identify standards for bias and discrimination within its testing protocols.
In its testing, bias is measured by comparing the false match rate (FMR) across demographic groups. If the FMR for Black male or female subjects is higher than the FMR for white male or female subjects, the algorithm is more likely to misidentify subjects within that demographic.
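The metric described above can be sketched in a few lines. This is a hypothetical illustration, not Corsight's or NIST's actual code: an "impostor" comparison pairs images of two different people, and a false match occurs when the algorithm's similarity score for such a pair exceeds the decision threshold. Bias appears as a gap in FMR between groups.

```python
def fmr_by_group(impostor_scores, threshold):
    """Compute the false match rate (FMR) per demographic group.

    impostor_scores: dict mapping a group name to a list of similarity
    scores produced for impostor (different-person) image pairs.
    A score at or above `threshold` counts as a false match.
    """
    rates = {}
    for group, scores in impostor_scores.items():
        false_matches = sum(1 for s in scores if s >= threshold)
        rates[group] = false_matches / len(scores)
    return rates

# Toy data: identical FMR across groups indicates no measured bias
# at this threshold, which is the outcome the NIST result describes.
scores = {
    "black_female": [0.10, 0.95, 0.20, 0.30],
    "white_male":   [0.15, 0.91, 0.25, 0.05],
}
print(fmr_by_group(scores, threshold=0.9))  # each group: 1/4 = 0.25
```

In NIST's protocol the comparison is made at a fixed operating threshold, so equal per-group FMRs at that threshold mean no group is disproportionately misidentified.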
With Corsight’s latest algorithm, the FMR for black female/male subjects is now identical to that of white female/male subjects, meaning Corsight AI has much less bias than the competition, even when it is being measured against other leading facial recognition platforms.
Corsight is currently used as NIST’s reference point in its reports, serving as the standard for the industry as a whole. These results represent not only an exciting outcome for Corsight but also a crucial development for this emerging technology.
Tony Porter, Chief Privacy Officer at Corsight AI, comments, “We’re absolutely thrilled with these results. We’re thrilled because this is another step forward in countering claims that bias is damaging the effectiveness of facial recognition technology. The argument that facial recognition software is not capable of being fair is frozen in time, and the performance of Corsight’s latest submission demonstrates that.”
He adds, “In relation to our most recent privacy release*, this NIST test goes to show that Corsight is stepping towards a 360-degree approach to trustworthy facial recognition. Our solution is accurate, it’s fast, and now, according to NIST, it can deliver fairness. We are proud of the results but do not consider this ‘bias solved.’ We are working hard to extend the test cases with NIST and provide a balanced real-world solution.”