More misogynist AI: facial recognition reveals porn star past

Image credit: metamorworks | Shutterstock.com

Someone in Germany claims to have built a facial-recognition algorithm that can cross-reference women's social media photos with the faces of porn actresses on sites like Pornhub.

According to a post on Chinese social networking platform Weibo, the unnamed user and some programming friends used the algorithm to identify faces in porn videos and photos by matching them against public non-porn photos taken from social media sites. The stated objective: “to help others check whether their girlfriends ever acted in those films”, although the user later walked that back, saying the intention was to help women find out if someone had uploaded videos of them to porn sites.

The user claims that his algorithm successfully identified over 100,000 adult video actresses around the world. He has yet to post any evidence of this, but he reportedly told Motherboard that he and his team will release a “database schema” and “technical details” next week.

However, regardless of whether his specific facial-recognition algorithm works as described, the point is that it’s certainly possible to do. The very fact that it occurred to him to develop this kind of app has generated plenty of outrage, particularly among feminists and AI researchers who see this as the latest example of men using AI-based technologies in terrifyingly misogynistic ways.
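To get a sense of how low the technical bar is, consider that off-the-shelf open-source libraries reduce the core face-matching step to a few lines of Python. The sketch below is a minimal illustration, not the Weibo user's actual code: it compares a face in one photo against a face in another using the popular face_recognition library, and the file names are hypothetical placeholders.

```python
# Minimal face-matching sketch using the open-source face_recognition
# library (built on dlib). File names are hypothetical placeholders.
import face_recognition

# Load a public social media photo and a frame grabbed from a video.
social_photo = face_recognition.load_image_file("social_profile.jpg")
video_frame = face_recognition.load_image_file("video_frame.jpg")

# face_encodings() returns one 128-dimensional vector per detected face.
known = face_recognition.face_encodings(social_photo)
candidates = face_recognition.face_encodings(video_frame)

if known and candidates:
    # Euclidean distance between encodings; the library's default
    # tolerance treats distances below 0.6 as a probable match.
    dist = face_recognition.face_distance([known[0]], candidates[0])[0]
    verdict = "probable match" if dist < 0.6 else "no match"
    print(f"distance = {dist:.3f} -> {verdict}")
else:
    print("No face detected in one of the images.")
```

That comparison is the entire matching core; the rest of an app like the one described (scraping profiles, extracting video frames, indexing encodings for fast lookup at scale) is commodity engineering rather than research.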

For example, Disruptive.Asia has written before about unethical AI developers posting easy-to-use tools online to create “deepfake” porn videos, in which someone can convincingly replace the face of an actress in an adult video with the face of (say) Scarlett Johansson or an ex-girlfriend for the purposes of humiliation.

Ironically, Pornhub itself introduced machine learning and face-recognition features to its site in 2017, ostensibly for the purpose of making it easier for users to find videos with specific actresses. But the same technology could also enable online trolls to stalk and harass amateur porn models who post their own videos to Pornhub, or women whose boyfriends or husbands secretly made sex tapes and posted them online without their knowledge.

All of this highlights the need for AI ethics standards such as the principles revealed this week by the OECD. But interestingly, there’s another regulation that could help here: GDPR.

The GDPR – which just turned a year old last week – contains specific provisions governing the collection of data for AI-based decisions, especially automated decision-making and profiling. That would include facial-recognition apps like the one described on Weibo.

This very discussion came up last year at Informa’s IFSEC industry conference, where one exhibition booth demonstrated facial-recognition software by filming attendees as they walked past and displaying their faces on a screen with estimations of their age, sex and emotional state (happy, sad, irritated, etc.). Some delegates expressed concern that the demo might contravene GDPR. Informa investigated and ultimately determined that the company in question had handled the data properly. However, Informa said in a blog post this week that it has been taking steps for IFSEC Global 2019 next month to ensure all exhibitor demos are GDPR-compliant.

So, theoretically, GDPR (or a local equivalent) could also be applied to the “find out if your fiancée is on Pornhub” app. Then again, the developer (or someone else who decides to build their own version of the app) might be the sort of person who doesn’t give a flying flip about GDPR compliance, or AI ethics in general.

In any case, it’s a worrying sign that some men are actively thinking up ways to apply AI to either humiliate women or police their sexual morality. And as both AI and facial-recognition technology become cheaper and easier to use, the more likely we are to see more apps like this.

Then again, facial recognition in general is starting to alarm some people regardless of the app – San Francisco went as far as to ban the technology over fears of potential abuse in the name of mass surveillance. Other municipalities might follow suit (apart from New York).

Still, you can’t un-invent technology, and for all the benefits of facial recognition, the negative applications are just too appealing to certain governments and misogynists alike. So expect more of this.

UPDATE [4 June, 5:00pm]: The programmer has reportedly apologized for the project and says he has deleted all data and files related to it – possibly because he realized the potential harm to women, possibly because several lawyers have confirmed that the project as described would indeed be very much illegal under GDPR.
