Monday, July 3, 2017

Biased data teaches algorithms how to discriminate

Math is a tool that doesn’t discriminate. There’s no bias in it; the numbers either add up or they don’t. Algorithms depend on math, but they’re data-driven, and sometimes the information fed into one is incorrect or doesn’t represent the algorithm’s actual goals. Cathy O’Neil, the author of Weapons of Math Destruction, cautions us against trusting the data being fed into our judicial systems. In the video she explains how mistakes made by algorithms make existing problems even worse: “And what ProPublica found was the COMPAS model, which is one version of a recidivism…”
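The point about biased inputs can be made concrete with a small, hypothetical sketch: below, two synthetic groups have the same true reoffense rate, but one group’s behavior is recorded more aggressively, so the bias lives in the labels. A naive risk model that simply learns rates from those labels reproduces the disparity. All names and numbers here are invented for illustration; this is not the COMPAS model or ProPublica’s methodology.

```python
import random

random.seed(0)

# Hypothetical synthetic "historical" dataset: both groups have the same
# true underlying rate, but group B's outcomes are recorded 1.6x more
# often (e.g. heavier policing) -- the bias is baked into the labels.
TRUE_RATE = 0.3
records = []
for group in ("A", "B"):
    over_recording = 1.0 if group == "A" else 1.6  # assumed measurement bias
    for _ in range(10_000):
        recorded = random.random() < TRUE_RATE * over_recording
        records.append((group, recorded))

def fit_group_rates(data):
    """A naive 'risk model': just learn the recorded rate per group."""
    counts, positives = {}, {}
    for group, label in data:
        counts[group] = counts.get(group, 0) + 1
        positives[group] = positives.get(group, 0) + int(label)
    return {g: positives[g] / counts[g] for g in counts}

rates = fit_group_rates(records)
# The model scores group B as higher risk even though the true rates
# are identical: it has faithfully learned the bias in its training data.
print(rates)
```

The model’s math is correct; the discrimination comes entirely from the data it was fed, which is exactly the failure mode O’Neil describes.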

This story continues at The Next Web

