Computing Bias Practice Test

3 Questions
Q1

Introduction: A school district considers using facial recognition to identify visitors at building entrances.

Examples of Bias: Computing bias means a system produces unfair results for some people, often because it reflects human choices or unequal social conditions. In facial recognition, bias appears when the tool misidentifies people with darker skin tones or women more often than others, especially if the training photos include fewer examples from those groups.

How Bias Emerges: Bias can emerge when a data set overrepresents one group, when labels include stereotypes, or when designers test the tool mostly on one population.

Impacts: Misidentification can lead to students or parents being wrongly questioned, denied entry, or reported to security, which can increase stress and distrust.

Mitigation Strategies: The district can test accuracy across demographic groups, use more representative data, add human review before action, and set clear limits on when the tool is used.
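One mitigation strategy named above, testing accuracy across demographic groups, can be sketched in code. This is a minimal illustration, not a real evaluation: the group names and the match results below are invented placeholder data, and a real audit would use a properly labeled test set.

```python
# Hypothetical sketch: checking a face-matching tool's accuracy per group.
# Each record is (demographic_group, was_the_person_identified_correctly).
# These values are illustrative placeholders, not real measurements.
records = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", True),
]

def accuracy_by_group(records):
    """Return {group: fraction of correct identifications} for each group."""
    totals, correct = {}, {}
    for group, ok in records:
        totals[group] = totals.get(group, 0) + 1
        correct[group] = correct.get(group, 0) + (1 if ok else 0)
    return {g: correct[g] / totals[g] for g in totals}

rates = accuracy_by_group(records)
for group, rate in sorted(rates.items()):
    print(f"{group}: {rate:.0%} correct")
```

A large gap between groups in this report is a signal of bias that the district would need to investigate before deploying the tool.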

Based on the text, which strategy is used to mitigate computing bias?
