Lecture: The danger of big data and its use as a ‘weapon of math destruction’
September 11, 2018
Data scientist Cathy O’Neil discussed the dangers of data usage in her lecture, “The Dark Side of Big Data,” in the Great Hall on Tuesday.
O’Neil graduated from the University of California, Berkeley, and received her doctorate in mathematics from Harvard in 1999. She was a postdoc at MIT and later served as a professor at Barnard College.
She then left academia for finance, but what drove her away from that field was how financial institutions lie about data and password protection, O’Neil said.
After explaining how flawed algorithms, or “weapons of math destruction” as she calls them, are often widespread, mysterious and destructive, O’Neil gave a few examples she has encountered in her work as a data scientist.
She first discussed a scoring system used to decide whether a teacher should be fired or given a raise. While some argue such a system weeds out bad teachers, this one was built on flawed statistics, O’Neil said.
Each teacher’s score was based on only about 20 students, and those students’ performance compared to the previous year decided the teacher’s fate after only a year, O’Neil said.
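To see why roughly 20 students is too small a sample, consider a minimal simulation, a sketch with assumed numbers rather than the district’s actual model: even if every teacher were equally effective, class averages over 20 students would swing widely by chance alone.

```python
import random

# Sketch: how noisy is an average over ~20 students? All numbers here
# are illustrative assumptions, not the district's actual model.
random.seed(0)

NUM_TEACHERS = 1000
CLASS_SIZE = 20   # roughly the sample behind each teacher's score
SCORE_SD = 15     # assumed spread of individual year-over-year changes

def class_average_change():
    """Average score change for one class, assuming every teacher has
    exactly zero true effect on student performance."""
    return sum(random.gauss(0, SCORE_SD) for _ in range(CLASS_SIZE)) / CLASS_SIZE

changes = sorted(class_average_change() for _ in range(NUM_TEACHERS))

# Even with identical teachers, chance alone spreads the scores apart:
print(f"bottom 5% of teachers score below {changes[50]:+.1f} points")
print(f"top 5% of teachers score above {changes[-50]:+.1f} points")
```

In this toy model, the teachers in either tail would be fired or rewarded for pure noise.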
“We’re not getting rid of the worst teachers and keeping the best teachers; actually, the best teachers are leaving,” O’Neil said.
O’Neil also told the story of a young man named Kyle, who was forced to take a personality test to get a job at a local Kroger. He was a straight-A student in high school who was attending Vanderbilt, but he failed the test.
When Kyle realized the test was similar to one he had taken at a hospital regarding his mental health, his father was able to build a case: under the Americans with Disabilities Act, companies are not allowed to require mental health assessments as a condition of employment, O’Neil said.
O’Neil said this is a “weapon of math destruction” because it is widespread, important, secret and destructive. But this issue doesn’t affect just Kyle; it impacts millions of people looking for jobs, she said.
O’Neil also talked about an algorithm that points officers toward high-crime areas to patrol. But, she said, we don’t entirely know where crimes happen, only where arrests do.
This creates “uneven policing” and assumes that crime only happens in certain areas, O’Neil said.
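Because new arrests feed back into the data, such a system can keep sending officers to the same neighborhoods. As a rough illustration of that loop, here is a sketch with invented numbers, not any real predictive-policing model: two neighborhoods offend at the same rate, but patrols follow the arrest record, and the arrest record follows the patrols.

```python
# Feedback-loop sketch: identical true crime rates, but patrols are
# allocated by recorded arrests. All numbers are invented.
CRIME_RATE = 0.3                 # same underlying rate in both areas
recorded = {"A": 12, "B": 10}    # A starts slightly ahead by chance

for year in range(5):
    # Send most patrols where the arrest data says crime "is".
    hot = max(recorded, key=recorded.get)
    patrols = {h: 16 if h == hot else 4 for h in recorded}
    # Arrests track patrol presence, not underlying crime, so the
    # "hot" neighborhood generates most of next year's data.
    for h in recorded:
        recorded[h] += round(patrols[h] * CRIME_RATE)
    print(f"year {year}: {recorded}")
```

After a few iterations, the data “show” neighborhood A is far more criminal than B, even though both offend at exactly the same rate.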
In some places in the United States, jail sentences may also be determined by an algorithm. O’Neil described a questionnaire that, once completed, is run through an algorithm that spits out a sentence.
O’Neil said this is unconstitutional, as some of the questions ask about parental criminal history or mental health.
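To make the objection concrete, here is a minimal sketch of how a point-based questionnaire could feed a sentencing recommendation. Every question, weight and threshold below is invented for illustration; this is not any real risk instrument.

```python
# Invented point system; not any real sentencing instrument.
WEIGHTS = {
    "prior_arrests": 2,             # points per prior arrest
    "parent_ever_incarcerated": 3,  # the kind of question O'Neil objects to
    "mental_health_treatment": 2,   # likewise: not openly admissible in court
}

def risk_score(answers):
    """Sum weighted answers into a single opaque number."""
    return sum(WEIGHTS[q] * answers.get(q, 0) for q in WEIGHTS)

def recommended_sentence_months(answers):
    score = risk_score(answers)
    if score >= 6:
        return 24   # "high risk"
    if score >= 3:
        return 12   # "medium risk"
    return 6        # "low risk"

# Two defendants with the same offense and record can get very different
# recommendations based purely on family history and health questions.
print(recommended_sentence_months({"prior_arrests": 1}))   # 6 months
print(recommended_sentence_months({"prior_arrests": 1,
                                   "parent_ever_incarcerated": 1,
                                   "mental_health_treatment": 1}))  # 24 months
```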
“You’re not just predicting the future, you’re describing the future,” O’Neil said.
There is also a need for regulation of algorithms, O’Neil said: something that audits tests with “secret” point systems, especially high-impact ones such as those determining jail sentences and job eligibility.
Generally, when a company is asked whether an algorithm “works,” its definition of working will differ from the customer’s: corporations consider an algorithm to be “working” if it turns a profit, O’Neil said.
O’Neil was asked what the average person should do about this problem.
She answered that this isn’t a problem individuals can protect themselves against, because it affects people without the power to opt out of the system: people in jail, people looking for jobs and people generally on the lower end of the socioeconomic spectrum. O’Neil said there is a need for a movement to create change.