Data Bias Is Real: What That Means for Outcomes and Equity
David J. Cox, PhD, MSB, BCBA-D; Ryan L. O'Donnell, MS, BCBA

The Glass Spheres Problem
In Minority Report, the precogs don’t make arrests.
They don’t issue warrants.
They don’t decide who is guilty.
They generate visions.
The system decides which visions get attention, how much weight they carry, and what happens next.
AI systems work the same way. They're not humans (recall issue #002 on how AI is Math, Not Magic). They can't intend to discriminate, exclude, or harm. They surface patterns based on the data they were given, the labels humans applied, and the thresholds a person chose: decisions made long before the tool ever reached a clinic, school, or agency.
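To make that last point concrete, here is a minimal, hypothetical sketch (the names, scores, and cutoffs are invented for illustration, not drawn from any real tool) showing how the same model output flags different people depending on a threshold a person chose:

```python
# Hypothetical risk scores produced by a model. The model is identical
# in both cases below; only the human-chosen cutoff differs.
risk_scores = {"Ana": 0.62, "Ben": 0.55, "Caro": 0.71, "Dre": 0.48}

def flagged(scores, threshold):
    """Return, alphabetically, the names whose scores meet or exceed the cutoff."""
    return sorted(name for name, s in scores.items() if s >= threshold)

print(flagged(risk_scores, 0.70))  # strict cutoff: only Caro is flagged
print(flagged(risk_scores, 0.50))  # lenient cutoff: Ana, Ben, and Caro are flagged
```

Nothing about the scores changed between the two calls; the person who picked 0.70 versus 0.50 decided who got attention. That choice is invisible to anyone who only sees the final list.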
The uncomfortable truth is this: Bias doesn’t require bad actors. It only requires unexamined data.
Chiron: The AI Literacy Series for ABA Professionals
A weekly newsletter exploring how ABA professionals can develop essential AI literacy skills to ensure ethical and effective practice in a rapidly changing field.