Controversial search engine being used to identify the dead and Russian operatives
The Ukrainian government is using facial recognition technology from startup Clearview AI to help them identify the dead, reveal Russian assailants, and combat misinformation from the Russian government and its allies.
Reuters reported yesterday that the country’s Ministry of Defense began using Clearview’s search engine for faces over the weekend.
The vendor offered free access to the search engine, which Ukraine is using for such tasks as identifying people of interest at checkpoints and identifying people killed during Russia’s invasion, the news organization wrote, citing Lee Wolosky, who currently advises Clearview and formerly worked as a US diplomat under Presidents Barack Obama and Joe Biden.
The newswire reported the company was one of a number of US-based artificial intelligence companies offering its aid in the wake of Russia’s invasion, which began February 24. Speaking to both Wolosky and Clearview AI CEO Hoan Ton-That, the news site said the CEO had sent a letter to officials in Kyiv.
In the letter, seen by Reuters, Ton-That said his five-year-old company had more than 2 billion images scraped from Russian social media service VKontakte – whose design is visually very similar to that of Facebook – among its database of more than 10 billion photos.
Through the database, the CEO claimed, Ukrainian officials can identify the dead more easily than with fingerprints or dental records, even if there is damage to the face.
Ukraine can also use the technology to reunite refugees who have fled the country with their families, identify Russian operatives in Ukraine, and help the government push back against false social media posts about the war, he went on to claim.
What the defense ministry is using the technology for is unclear, Ton-That said, adding that other organizations within Ukraine’s government likely will begin deploying Clearview’s facial recognition tools in the near future.
Wolosky told Reuters that the VKontakte images give Clearview a more comprehensive dataset than such competitors as PimEyes, a publicly available image search engine. Clearview said it had not made its technology available to Russia.
The company has lately been seeking large governmental contracts.
Ton-That emphasized that the controversial technology shouldn’t be regarded as the sole source of identification or used in ways that violate the Geneva Conventions, which set legal standards for humanitarian treatment during war. Those in Ukraine using the technology are receiving training and must enter a case number and a reason for the search before queries can be made, he said.
New York City-based Clearview was founded by Ton-That and Richard Schwartz in 2017, and The Register has been covering it for years, although it became even more prominent in January 2020 when The New York Times reported on the company under the headline “The Secretive Company That Might End Privacy as We Know It.”
The company’s mission statement reads in part that its goal “is to deliver the most comprehensive image-search solutions in the world. We support law enforcement and national organizations in their mission to identify victims and perpetrators in order to safeguard their communities and secure industry and commerce.”
It has drawn the ire of privacy and security groups as well as companies like Meta (née Facebook), Google, and even Venmo, which have demanded that Clearview stop using “their” data.
In a blog post last month applauding such lawsuits, the Electronic Frontier Foundation (EFF) wrote that it supports efforts to ban facial surveillance, calling it “a growing menace to racial justice, privacy, free speech, and information security. … One of the worst offenders is Clearview AI, which extracts faceprints from billions of people without their consent and uses these faceprints to help police identify suspects. For example, police in Miami worked with Clearview to identify participants in a Black-led protest against police violence.”
The EFF says laws should require organizations to get opt-in consent from a person before capturing their faceprint.
Another group, the Surveillance Technology Oversight Project (STOP), a year ago wrote that it opposed Clearview’s plans to grow its database to more than 100 billion images, saying the effort was “dystopian, disturbing, and must be stopped.”
Speaking to Reuters, STOP Executive Director Albert Fox Cahn said that Clearview’s technology could misidentify people at checkpoints and during battle and that a mismatch could lead to civilian deaths.
“We’re going to see well-intentioned technology backfiring and harming the very people it’s supposed to help,” Fox Cahn said, adding that while identifying the dead is likely the least dangerous way to use the facial recognition tools, “once you introduce these systems and the associated databases to a war zone, you have no control over how it will be used and misused.”
Despite the controversies, Clearview in October 2021 said it had raised $30m – bringing the total funding for the company to $38.6m – and two months later it was reported that the firm was attempting to patent its technology in the US.