Interesting article in the Times:

Cade Metz and Adam Satariano, NY Times

Across the United States and Europe, software is making probation decisions and predicting whether teens will commit crime. Opponents want more human oversight.

He didn’t realize that an algorithm had tagged him high risk until he was told about it during an interview with The New York Times.

“What do you mean?” Mr. Gates, 30, asked. “You mean to tell me I’m dealing with all this because of a computer?”

In Philadelphia, an algorithm created by a professor at the University of Pennsylvania has helped dictate the experience of probationers for at least five years.
This reminds me a bit of this book:

A Philosophical Investigation, by Philip Kerr
LONDON, 2013. Serial killings have reached epidemic proportions—even with the widespread government use of DNA detection, brain-imaging, and the “punitive coma.” Beautiful, whip-smart, and driven by demons of her own, Detective Isadora “Jake” Jacowicz must stop a murderer, code-named “Wittgenstein,” who has taken it upon himself to eliminate any man who has tested positive for a tendency towards violent behavior—even if his victim has never committed a crime. He is a killer whose intellectual brilliance is matched only by his homicidal madness.