New Venture Shines Spotlight on Facial Recognition Collection and Databases

Facial recognition software, and the databases that the images end up in, remain controversial. At the time of writing, a petition to stop the Met Police using facial recognition surveillance is a few thousand signatures short of its target, whilst another by privacy campaign group Liberty calls for a total ban on facial recognition technology as “it breaches everyone’s human rights and is discriminatory and authoritarian.”

What do we know about facial recognition software, how it is used and how the data is stored? On the latter point, if giant databases of our faces are being held, who holds them and for what purpose? One organization looking to educate the public about face recognition databases is the Electronic Frontier Foundation (EFF), which this week launched a new service and quiz built around that question.

Named Who Has Your Face, the campaign reviewed thousands of pages of public records to determine, as far as possible, which government photos of US citizens, residents and travellers are shared with which agencies for facial recognition purposes.

“These public records have shown us that biometric database sharing is widespread and completely unregulated—and this is still just a partial picture,” said Clare Garvie, senior associate with the Center on Privacy & Technology at Georgetown Law, which brought the campaign to life with the EFF.

“Americans deserve to know how their biometric information is being used, especially when it may put them at risk of being misidentified as a criminal suspect.”

“It should be easy to learn the full list of which entities have personal data that you’ve been required to hand over”

In putting the campaign together, the EFF learned that government agencies—including Immigration and Customs Enforcement, the Department of Homeland Security and the FBI—could all have some access to these photos. “However, despite hundreds of hours of research, it’s nearly impossible to know precisely which agencies are sharing which photos, and with whom,” it said, as each state's Department of Motor Vehicles (DMV) shares access to their photos differently, depending on agreements with local police, other states and federal agencies.

“Here’s the truth: it should be easy to learn the full list of which entities have personal data that you’ve been required to hand over in exchange for a driver’s license or for re-entry into the country after visiting family abroad—especially when that’s a photo of your face,” said EFF surveillance litigation director Jennifer Lynch.

“Most people realize that their photos are scanned into a database, but they don’t realize this effectively makes them part of a perpetual police line-up. That’s what’s happening to millions of people, without their knowledge, and it’s practically impossible to opt out.”

EFF digital strategist Jason Kelley said in a blog post that the number of people affected by face recognition is staggering: the organization counted at least 27 states where the FBI can search or request data from driver’s license and ID databases. Meanwhile, at least 43 DMVs use facial recognition, with only four of those entirely limiting data sharing.

“That puts two-thirds of the population of the US at risk of misidentification, with no choice to opt out,” he said, adding that the number “is unconscionable” and these data-sharing agreements “violate the privacy of thousands of people every day.” 

Taking the Test
I took the quiz to find out how widely my own images are shared. It is a five-part test, starting with a question on whether I have a US driver’s license or state photo identification card. That is a no. The next question asks where I live (London, in my case) and whether I have a US passport or visa: a yes to the latter. It then asks whether I have signed up for the TSA PreCheck Program, which was designed by DHS to allow approved travellers to pass through expedited security screening at certain US airports, and finishes by asking whether I have ever applied for a US government job that required a photo. No to the last two questions.

With several no answers, and given that I am not a US citizen and don’t hold a US driver’s license, I assumed my score would be low. It turns out not: the DHS, the FBI’s FACE Services unit and the Department of Defense will all have my image because of my US working visa, not to mention all the times I have been photographed entering the US at border control.
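The quiz is, in effect, a simple mapping from yes/no answers to the agencies that may hold the corresponding photo. The sketch below is a hypothetical illustration of that logic in Python, not the EFF's actual implementation; the agency mappings for the visa question follow the result described above, while the other mappings are illustrative assumptions only.

```python
# Hypothetical sketch of the Who Has Your Face quiz logic: each "yes"
# answer adds the agencies that may hold or access the corresponding
# photo. Only the visa mapping reflects the result reported in this
# article; the rest are illustrative placeholders.
AGENCY_MAP = {
    "drivers_license": {"State DMV", "FBI FACE Services"},       # assumption
    "us_passport_or_visa": {"DHS", "FBI FACE Services",
                            "Department of Defense"},            # per article
    "tsa_precheck": {"TSA", "DHS"},                              # assumption
    "federal_job_photo": {"Office of Personnel Management"},     # assumption
}

def who_may_have_your_face(answers):
    """Return the set of agencies implied by the 'yes' answers given."""
    agencies = set()
    for question, answered_yes in answers.items():
        if answered_yes:
            agencies |= AGENCY_MAP.get(question, set())
    return agencies

# Mirroring the author's answers: no US license, yes to a visa,
# no to TSA PreCheck and federal-job photos.
result = who_may_have_your_face({
    "drivers_license": False,
    "us_passport_or_visa": True,
    "tsa_precheck": False,
    "federal_job_photo": False,
})
```

Under these assumed mappings, a single "yes" on the visa question is enough to place an image with three federal bodies, which matches the experience described above.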

Back in 2015, security researcher Runa Sandvik demonstrated how she was able to request all of her photos in a Freedom of Information Act request. She said that “the information that is collected by these programs is shared with federal, state and local agencies. It is unclear how long the information is retained for.”

“I would imagine that all of the current facial recognition suppliers and practitioners will soon have to abide by the same rules as everybody else”

Will this effort make any difference to governments and better protect citizens? Brian Higgins, security specialist at Comparitech.com, said he believed that facial scans may well fall under section 1 of the NIST definition of Personally Identifiable Information (PII), in which case their possession, retention and storage would all be governed by the General Data Protection Regulation (GDPR).

“In fact, GDPR article 4(1) states that physical and physiological factors are considered to be ‘personal data’ but none of this has been tested in the courts yet, but I can’t think of anything more personally identifiable than your face, so I would imagine that all of the current facial recognition suppliers and practitioners will soon have to abide by the same rules as everybody else,” he said.

Tim Mackey, principal security strategist at the Synopsys CyRC (Cybersecurity Research Center) said that the public doesn’t typically think in terms of themselves as being subject to a search or being part of a potential dataset of possible suspects. “With facial recognition everyone becomes a suspect until filtered out, and how that filtering process functions can be subject to unknown biases based on how the underlying machine learning models were trained,” he said.

Is it a case of whether people should expect to have their facial images collected and stored for national security reasons? Mackey said that technology is evolving at such a pace that the public is placed in the position of needing to put the proverbial genie back into its bottle if a given technology oversteps public expectations.

He said: “Law enforcement in the US often finds itself in a position to benefit from new technologies before jurisprudence weighs in on the applicability of the technology in light of existing legislation. Companies marketing their technologies also tend to focus on how their solution will help solve the worst of the ‘bad guy’ cases and in so doing ignore what legal opinions might exist on their implementations.

“The net result of this situation is that both law enforcement and vendors operate outside of public awareness. When you consider that many technology selection decisions are made at the law enforcement level and not with input and review from local, state or federal elected officials, it becomes unavoidable that novel but questionable uses for data will occur. This is a perfect example of the adage ‘given access to data, teams will find cool ways to use it, even if they shouldn’t.’”

If this goes some way toward helping the public understand the reality of facial recognition software, then the EFF will see it as worthwhile. Whether it causes a change in the way government departments operate remains to be seen.