Facial recognition software that showed statistically undetectable bias against women and people of color in a recent government study will soon be deployed in systems used to screen travelers at airports and at U.S. ports of entry, a top U.S. Customs and Border Protection official told lawmakers Thursday.
“We’re using a high performing vendor,” John Wagner, deputy executive assistant commissioner for the border agency’s Office of Field Operations, said at a House Homeland Security Committee hearing.
Still, the move didn’t reassure many Democrats, who said they remained worried about racial and gender bias in the technology, as ...