Published on July 11th, 2019
House lawmakers probe facial recognition programs following CBP contractor data breach
A top Customs and Border Protection official told members of the House Homeland Security Committee that an investigation into a subcontractor's handling of images taken of travelers crossing the U.S.-Mexico border could result in criminal or civil charges.
The hearing focused on reports last month that a malicious attack on a CBP subcontractor exposed nearly 100,000 images of travelers and license plates collected at a U.S. border checkpoint.
John Wagner, CBP's deputy executive assistant commissioner of field operations, told lawmakers that the vendor, which the Associated Press identified as Perceptics, violated the terms of its contract by copying those images onto its own network.
“As far as I understand, the contractor physically removed those photographs from the camera itself and put them into their own network, which was then breached. The CBP network was not hacked,” Wagner said Wednesday.
However, CBP did not apply to the pilot program the same level of network security it maintains for its main systems. As a result, no safeguards prevented subcontractor employees from connecting a portable media drive to the network and copying those CBP images.
“This was a standalone pilot, so it was outside of our normal network. We apparently did not have the same level of controls and audit capabilities on that, because it was a standalone, closed system … Our main network has these protocols on them, but we didn't have them on this type of system,” Wagner said, adding CBP would follow up with the committee once those controls have been put in place for the pilot.
The subcontractor's removal of those photos violated the terms of its contract with CBP, Wagner said, leading CBP to terminate the contract and launch an investigation. The Department of Homeland Security's inspector general office will conduct its own review of the breach.
“Depending on the circumstances of how the data was taken and the intentions and why and how it was used, there potentially could be criminal actions,” Wagner said.
Lawmakers scrutinize transparency, security of programs
That lapse in security procedures sparked bipartisan concern and led committee members to question whether facial recognition pilot programs run by other parts of the Department of Homeland Security can adequately protect sensitive biometric data or deliver accurate results.
Most committee members generally approved of the agencies' use of facial recognition technology, but called for greater transparency into the scope of those programs.
Committee Chairman Bennie Thompson (D-Miss.) said facial recognition could prove a valuable tool for national security, but said questions remain about the privacy, data security, transparency, and accuracy of agency programs.
“The American people deserve answers to those questions before the federal government rushes to deploy biometrics further,” Thompson said.
Rep. John Katko (R-N.Y.) said commonplace law enforcement tools, such as fingerprint and DNA testing, went through similar vetting procedures before gaining widespread acceptance.
“My concern is not with the efficacy of using it. My concern is that we get it right … I am very concerned about the accuracy, and that was a very big thing with DNA starting out, and now the accuracy is amazing,” Katko said.
The Transportation Security Administration, for example, has partnered with CBP since October 2017 to run a facial recognition pilot at three major airports across the country.
Passengers can opt out of the biometric scan and instead ask transportation security officers to review their passport and boarding pass.
The pilot stems from CBP's existing authority to screen the biometrics of non-U.S. citizens as they enter and leave the country. Wagner said images captured of U.S. citizens are held for 12 hours and then deleted from the system.
“The only reason we hold it for that short period of time is just in case the system crashes and we have to restore everything,” he said.
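The 12-hour retention rule Wagner describes amounts to a simple time-based purge. A minimal sketch, assuming a store keyed by image ID with capture timestamps (the function name, field names, and data layout are illustrative assumptions, not CBP's actual system):

```python
# Hypothetical sketch of a 12-hour retention purge: keep an image only
# long enough to restore the system after a crash, then delete it.
from datetime import datetime, timedelta

RETENTION = timedelta(hours=12)  # retention window stated in the article

def purge_expired(images: dict, now: datetime) -> dict:
    """Return only the images captured within the retention window."""
    return {img_id: captured for img_id, captured in images.items()
            if now - captured < RETENTION}

now = datetime(2019, 7, 11, 12, 0)
images = {
    "img-a": now - timedelta(hours=2),   # within 12 hours -> kept
    "img-b": now - timedelta(hours=13),  # past 12 hours -> deleted
}
print(sorted(purge_expired(images, now)))  # ['img-a']
```

In practice such a purge would run on a schedule, with the crash-recovery copy being the only reason any U.S.-citizen image persists at all.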
The Secret Service is also working on a facial recognition pilot on the grounds outside the White House. The pilot seeks to match images of Secret Service employees, who have volunteered for the pilot, as they move around the grounds of the White House.
Joseph DiPietro, the Secret Service's chief technology officer, said the agency retains 30 days' worth of images at a time under the pilot, and will delete all of its images when the pilot ends.
“We're trying to match the individuals that are in the pilot, the volunteers, to the people who we're seeing in those cameras. If there's no match, there's no record. If there is a match, then there's a record,” DiPietro said.
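The match-then-record policy DiPietro describes can be sketched in a few lines. Everything here is an illustrative assumption (the threshold, the similarity score, and all names), not the Secret Service's actual implementation; the only part taken from the article is the rule that a record exists if and only if a detection matches an enrolled volunteer:

```python
# Hypothetical sketch: store a record only when a detected face matches
# an enrolled volunteer; no match means nothing is retained.
from dataclasses import dataclass
from typing import Optional

MATCH_THRESHOLD = 0.6  # assumed similarity cutoff, not a published value

@dataclass
class Detection:
    camera_id: str
    similarity: float            # similarity to the closest enrolled volunteer
    volunteer_id: Optional[str]  # closest enrolled volunteer, if any

def process_detection(detection: Detection, records: list) -> None:
    """Append a record only when the detection matches a volunteer."""
    if detection.volunteer_id is not None and detection.similarity >= MATCH_THRESHOLD:
        records.append((detection.camera_id, detection.volunteer_id))
    # No match -> no record is kept, per the pilot's stated policy.

records = []
process_detection(Detection("north-lawn-1", 0.82, "vol-017"), records)  # match
process_detection(Detection("north-lawn-1", 0.31, None), records)       # no match
print(records)  # only the matched detection produces a record
```

The design choice the quote implies is that filtering happens before storage, so non-volunteers never enter the retained dataset at all.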
The agency launched the pilot last December and will conclude it in August. That window gives the algorithm an opportunity to show whether it can pinpoint the same Secret Service volunteers in summer clothes and in heavy winter coats.
NIST gives high marks on accuracy
As for concerns about accuracy, Charles Romine, director of the Information Technology Laboratory at the National Institute of Standards and Technology, said the “very best” facial recognition algorithms that NIST has tested boast a 99.7% accuracy rate and perform within the range of the best human examiners.
But challenges still remain in getting these facial recognition systems to accurately identify women and people of color.
Romine said those discrepancies will shrink as facial recognition technology improves, especially if it continues at the rate it has progressed over the past five years, but said those challenges won't go away entirely.
“It is unlikely that we will ever achieve a point where every single demographic is identical in performance, whether that's age, race or sex. But we want to know just exactly how much the difference is,” Romine said.
NIST will further examine these challenges in a report it will release this fall.
But despite NIST's rigorous testing of off-the-shelf facial recognition products, the current state of artificial intelligence doesn't allow algorithms to show how they arrived at their answers.
“We have no direct knowledge of the convolutional neural networks or machine learning, because these are submitted to us as black boxes and we don't examine that,” Romine said.
Copyright © 2019 Federal News Network. All rights reserved.