Certain crimes, such as tax evasion and money laundering, aren’t the kinds of offenses that predictive policing software is designed to detect. So the software—like the human police who use it—is biased toward nuisance crimes committed in low-income and minority neighborhoods. Thus, predictive policing software is, at its core, biased against people of color and working-class people—even though its makers claim it works against racism and classism.