The FTC Forced a Misbehaving A.I. Company to Delete Its Algorithm

Dave Gershgorn

OneZero

Jan 19, 2021

In 2019, an investigation by NBC News revealed that the photo storage app Ever had used billions of its users' photos to train facial recognition algorithms without their permission. Ever then sold those algorithms to law enforcement agencies and the U.S. military, which wanted to improve their own facial recognition systems. This week, a Federal Trade Commission decision is forcing Paravision, the parent company of Ever, not only to delete the photos Ever had stolen from its users but to delete the algorithms as well. The ruling sets a precedent in battles over artificial intelligence. The decision is a costly one, too, since deleting the algorithms means Ever can't make good on its contracts with law enforcement agencies and the military. Chalk this one up as a win for enforcing privacy regulations in AI.

Join us.

To stay informed about the ways AI is affecting you and your community, sign up for our newsletter. Now is the time to stay updated on AI in the interest of our communities.

Community Partners

AI4All
Black in AI
Data Science 4 Everyone
Hispanic Federation
Latinx in AI
LULAC
NAMIC
National Urban League
Queer in AI
Women in Machine Learning
Women in AI
World Economic Forum

Supporters

Airbnb
Amazon
Chan Zuckerberg Initiative
Meta
Microsoft

We are proud to be sponsored by some of the world's leaders in AI and AI-related fields. These organizations are drawing the maps for an unknown world. By recognizing the need to engage communities of color, these partners are ensuring a more equitable AI future for everyone.

Become a Sponsor