PimEyes: The Facial Recognition Search Engine


Recent investigation and reporting from The Intercept have revealed major child safety concerns related to PimEyes, the facial recognition search engine.

Previously, PimEyes received backlash for including search results drawn from social media accounts and other sites not usually indexed by search engines. While the company has removed those sources from its results, plenty of the remaining results link to potentially identifying information. In its investigation, The Intercept found PimEyes results that included pictures of children on charity sites or in local news videos: details that could help someone trying to locate a child, whether a predator or an abusive, non-custodial parent.

More alarmingly, The Intercept reports that one 16-year-old used PimEyes to find “revenge porn” images of herself that an ex had posted online. These images are child pornography, yet they appear in PimEyes search results (blurred and marked as potentially explicit unless the user pays for premium features).

While PimEyes claims to have some systems in place for monitoring and addressing harmful use of the platform, it is clear these systems are minimal and may not function as intended.

By contrast, the child abuse prevention nonprofit Thorn has used facial recognition technology to identify images connected to child sex trafficking and to take action against it.

What does all this mean for parents?

Parents may want to use PimEyes to search for images of their own family, see where those images appear online, and check whether they are connected to identifying information. Unfortunately, viewing the URLs where the images are hosted requires a paid premium account.

Most importantly, parents need to be thoughtful about what they and their children post online. This should be an ongoing, age-appropriate conversation so that kids understand the risks, the reasons for parental limits, and whom to go to if something goes wrong.

Read the full Intercept report: Facial Recognition Search Engine Pulls Up Potentially Explicit Photos of Kids