Vaccine Prioritization, Artificial Intelligence, and Communities of Color: What You Need to Know

By Susan Gonzales and Zachary Solomon

Last week, Stanford Medicine issued an apology for releasing a vaccine prioritization list that included only seven medical residents and fellows. Stanford Medicine has more than 1,300 residents on staff, many of whom are routinely asked to treat Covid-19 patients.

What happened? In short, artificial intelligence happened. Stanford used an algorithm to determine which of their employees would receive a vaccine from their first shipment of 5,000 doses. According to NPR, the algorithm was meant to “ensure equity and justice” by prioritizing Stanford’s health care workers who are at the highest risk for Covid-19 infections. Other factors the algorithm considered were age and where employees were physically located in the hospital. Since residents tended to be young and to lack an assigned location, they fell to the bottom of the AI-created priority list.
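To see how such a rule can misfire, consider a minimal, hypothetical scoring sketch. The actual Stanford algorithm has not been published; the weights, thresholds, and field names below are illustrative assumptions only, built from the three factors reported above (infection risk, age, and assigned hospital location):

```python
# Hypothetical priority score based on the three reported factors.
# These weights are illustrative assumptions, NOT Stanford's actual algorithm.
def priority_score(age, risk_exposures, assigned_location):
    score = 0
    score += 2 if age >= 65 else 0   # older staff ranked higher
    score += risk_exposures          # e.g., count of high-risk units worked
    # Residents often rotate through departments and have no single
    # assigned location, so this term silently contributes nothing for them.
    score += 3 if assigned_location else 0
    return score

# A young resident with no fixed assignment scores near zero,
# even if they treat Covid-19 patients daily.
resident = priority_score(age=29, risk_exposures=1, assigned_location=None)
attending = priority_score(age=58, risk_exposures=1, assigned_location="ICU")
print(resident, attending)  # → 1 4
```

The failure mode is not a coding bug but a modeling gap: a field that is systematically empty for one group quietly pushes that entire group down the ranking.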

One of the residents who protested the oversight said a flawed algorithm is no excuse.

At AIandYou, we routinely see the dangers inherent in AI-assisted decision making. We recognize that unchecked and unvetted artificial intelligence can have adverse effects on the communities that need the most help. AI development needs to be more inclusive of different perspectives, especially during a global pandemic in which Covid-19 is affecting communities in such widely varying ways. Our hope is that organizations like Stanford Medicine will be more creative and thorough in seeking the perspectives of people of color when developing AI.

So when will communities of color receive the vaccine?

States like California are under pressure to distribute vaccines to vulnerable communities first. Indeed, those communities — often made up of laborers, farmers, undocumented workers, and other frontline, essential workers — have been disproportionately affected by Covid-19. States are increasingly releasing their vaccine plans, some of which emphasize equitable vaccine distribution for people of color.

As of now, it remains to be seen how the vaccine will continue to be allocated across the United States. What we do know is that some states, including Ohio, Wisconsin, South Carolina, and Arizona, are using algorithms to determine vaccine distribution. Ohio’s plan, for instance, considers factors such as current case count, level of natural immunity, “social vulnerability,” and “health equity.”

We should expect that errors will continue to be found in these algorithms. And though these errors will eventually be corrected, some lasting damage may already have occurred. Be sure to ask your insurance provider, employer, and local elected officials about distribution of the vaccine to your community. Now is the time to ask questions and be informed.

Join us.

To stay informed about the ways in which AI and new technologies are affecting you and your community, sign up for our newsletter. Now is the time to keep up to date on AI and new technologies in the interest of our communities.


World Economic Forum
National Urban League
Hispanic Federation
National Fair Housing Alliance
Black in AI
Queer in AI
Latinx in AI
Women in AI
Women in Machine Learning


Chan Zuckerberg Initiative

We are proud to be sponsored by some of the world's leaders in AI and AI-related fields. These organizations are drawing the maps for an unknown world. By recognizing the need to engage communities of color, these partners are ensuring a more equitable AI future for everyone.

Become a Sponsor