http://www.zdnet.com/porn-companies-adopt-facial-recognition-technology-encourage-instagram-photos-7000007631/
Two porn companies are courting web surfers to upload photos they find online to the companies' free facial-recognition, face-matching database services. With SexFaceFinder.com and Naughty America's "Face" anyone can upload an image and have the services match it with images and faces in image databases.
SexFaceFinder positions its service as a way for users to find a performer who looks like a specific person, or performers who look like the user's favorite type of model — an effort to engage the user with a service that closes the marketing gap between a user and their fantasy.
Another company, Naughty America, openly solicits users to upload images of girls found on Instagram and other internet destinations in an effort to find the photo's subjects in porn - or find celebrity look-alikes, girlfriend and ex-girlfriend look-alikes, or similar/specific porn performers.
That's already pretty creepy when you consider the potential for stalkers to go digging up dirt on people.
But if I were a really nasty piece of work, here's what I'd do: get a big archive of porn, get an archive of photos linked to real-life identities (Facebook or LinkedIn would work well for this), and run an automated search for matches. Hey presto, automated industrial-scale blackmail!
Ugh.
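For what it's worth, the cross-matching step in that scenario is trivially cheap once you have face embeddings. Here's a minimal sketch, assuming some face-recognition model has already reduced each photo to a fixed-length embedding vector (the random vectors, database sizes, and the planted match are all made up for illustration):

```python
import numpy as np

def normalize(v):
    # L2-normalise each row so a dot product equals cosine similarity
    return v / np.linalg.norm(v, axis=1, keepdims=True)

rng = np.random.default_rng(0)
# Hypothetical precomputed embeddings: one row per face, 128 dimensions
porn_db = normalize(rng.normal(size=(1000, 128)))    # performer faces
social_db = normalize(rng.normal(size=(500, 128)))   # faces tied to real identities
social_db[42] = porn_db[7]                           # plant one match for the demo

# All-pairs cosine similarity in a single matrix multiply
scores = social_db @ porn_db.T
THRESHOLD = 0.95
matches = np.argwhere(scores > THRESHOLD)
print(matches)  # each row is (social_index, porn_index)
```

Comparing half a million pairs is one matrix multiply; scaling it to millions of identities is an engineering problem, not a research one. That's the point: nothing about the "industrial-scale" part is hard.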