The FTC Forced a Misbehaving A.I. Company to Delete Its Algorithm

Dave Gershgorn

OneZero

Jan 19, 2021

In 2019, an NBC News investigation revealed that the photo storage app Ever had used billions of its users' photos to train facial recognition algorithms without their permission. Ever then sold that facial recognition technology to law enforcement agencies and the U.S. military. This week, a Federal Trade Commission decision is forcing Paravision, the company behind Ever, not only to delete the photos Ever had taken from its users, but to delete the algorithms trained on them as well. The ruling sets a precedent in battles over artificial intelligence, and it is a costly one: deleting the algorithms means Ever can't make good on its contracts with law enforcement agencies and the military. Chalk this one up as a win for enforcing privacy regulations in AI.

Join us.

To stay informed about the ways AI and new technologies are affecting you and your community, sign up for our newsletter. Now is the time to stay up to date on these technologies for the sake of our communities.

Partners

World Economic Forum
National Urban League
Hispanic Federation
NAMIC
National Fair Housing Alliance
Black in AI
Queer in AI
Latinx in AI
Women in AI
Women in Machine Learning

Supporters

Amazon
Meta
Chan Zuckerberg Initiative
Microsoft
Airbnb

We are proud to be sponsored by some of the world's leaders in AI and AI-related fields. These organizations are drawing the maps for an unknown world. By recognizing the need to engage communities of color, these supporters are helping to ensure a more equitable AI future for everyone.

Become a Sponsor