Interpreting Author's Analogies Practice Test
1 Question
Q1
Read the passage and answer the question.
In a computer science elective, we learned that algorithms can inherit bias from the data that trains them. At first, that sounded like accusing math of having opinions, which felt absurd. Yet the examples were specific: facial recognition struggled more with certain skin tones, and hiring tools sometimes favored familiar backgrounds. The discomfort in the room was real, because the problem sounded both technical and moral.
Our instructor said a model is *like a mirror with smudges*, reflecting patterns but also distorting them. The context was training data: if the mirror has been handled carelessly, it shows some faces clearly and others poorly. The analogy helped us see that the system is not “evil,” but it is not neutral either. It carries the marks of what it has been given.
He emphasized that cleaning the mirror requires deliberate work: better data, careful testing, and accountability. That detail prevented the analogy from becoming fatalistic. Smudges are not destiny. They are evidence of contact.
We left the class thinking about responsibility in design. Technology, we realized, is shaped by human choices, even when it looks automatic. The mirror does not choose what it reflects.
In the passage, what does the analogy between a mirror with smudges and an algorithm suggest about bias?