BY CONSUMER REPORTS
Three short films look at how biases in algorithms and data sets result in unfair practices for communities of color, often without their knowledge. Directed by filmmaker Alice Gu.

MEDICAL DEVICES

As COVID-19 and other respiratory illnesses sweep the U.S., healthcare professionals rely on pulse oximeters to measure blood oxygen levels. But the devices are less accurate for patients with darker skin. What happens when a gold standard in diagnostics doesn’t work as well for some groups as it does for others?

MORTGAGE LENDING

Historically, lenders considered non-white neighborhoods to be at high risk of default, and unfair redlining practices prevented generations of people of color from accumulating the wealth typically built through homeownership. Decades later, computer algorithms may continue to perpetuate these biased practices.

FACIAL RECOGNITION

This form of artificial intelligence, which detects physical features to identify individuals, is no longer the stuff of science fiction. In today's world, this groundbreaking and controversial technology not only unlocks smartphones but also helps corporations and governments surveil citizens.

PETITION

BAD INPUT: Urge companies to stop algorithmic bias. Read the full petition.
Think BAD INPUT could start some good discussions? Download Conversation Prompts here.

Films produced by Logan Industry