IBM used Flickr photos for facial-recognition project

Using "human facial images" from public image galleries such as Flickr, American company IBM's study titled "Diversity in Faces" aimed to advance "fairness and accuracy in facial recognition technology", their website stated.

The American technology company "scraped" around one million photos of members of the public from the photo-sharing website Flickr and used them to conduct research into bias in facial recognition technology.

Users were unaware that their photos had been collected for the project and had no way of knowing that photos taken from their personal accounts were included in the dataset, an NBC News investigation found.

John Smith, who oversees AI research at IBM, said that the company was committed to "protecting the privacy of individuals" and "will work with anyone who requests a URL to be removed from the dataset".

According to The Verge, the licence on the images did not permit their use in facial recognition programmes that profile people by ethnicity.

The company extracted almost one million photos from a dataset of Flickr images originally compiled by Yahoo.

But while IBM was using images legitimately shared under Creative Commons licences, the company never informed the people whose faces appear in the nearly one million photos how their likenesses were being used.

The photos in IBM's dataset also do not include usernames or subjects' names, which makes it hard to identify the people pictured.

The photographers who took the Flickr photos may have had their subjects' permission to use the images, but there was no clear line of consent allowing the photos to be used to train IBM's facial recognition software. "It seems a little sketchy that IBM can use these pictures without saying anything to anybody," one photographer told NBC.

Digital rights group Privacy International said IBM had been wrong to use the photos without direct consent from those pictured.

"IBM says people can opt out, but is making it impossible to do so", she tweeted.

IBM asks those who want their images removed from the dataset to email links to the specific photos.

The only problem? The dataset isn't publicly available; it is shared only with researchers, so Flickr users and the people featured in their images have no real way of knowing whether they're included.
