Photo App Develops Facial Recognition Tool with User Images


Photo storage app Ever reportedly used the millions of images uploaded by its users to train a commercial facial recognition system without obtaining their consent, according to NBC News.

Without disclosing its use of the images to the app's users, Ever also reportedly offered to sell that facial recognition technology to private companies, law enforcement and the military.

“What began in 2013 as another cloud storage app has pivoted toward a far more lucrative business known as Ever AI – without telling the app’s millions of users,” wrote NBC News.

The story has raised privacy questions about whether photo-based apps should use photo data submitted by end users to train facial recognition systems without their consent, even in cases where identifying information about users isn't shared.

“Users need to consent to be training data. Faces are one of the most personal things we have, and people can have legitimate reasons for not wanting to be included in surveillance training systems,” said Miju Han, director of product management at HackerOne.

“That doesn’t even touch the representation problems that can come from unevenly distributed demographics. Apple did the ‘right thing’ in developing Face ID – they paid a representative sample of users who consented to being a part of the program to develop a successful algorithm in a privacy-forward way.”

At issue is what users do and don't understand about what happens to the data they choose to upload to any service. “Unless they go through the terms of service in detail, they do not have guarantees about how their data is used. For example, people need to think deeply about whether or not they would want to upload their genetic data to test it for additional markers. Even if there are protective terms, a data breach could put their genetic code in the hands of anyone interested in it,” Han said.
