DIKW Blog Post

In “Weapons of Math Destruction,” Cathy O’Neil outlines forms of data processing that are intended to replace or aid human decision making. These programs try to interpret data and produce outcomes much as an actual person would.

One of these programs, IMPACT, measures teachers’ efficiency in the classroom. O’Neil describes the DIKW process for this particular system. The data was the students’ math and language test scores. The information was how well the students were interpreted to have done based on those scores. Knowledge, in this case, was the assessment of each teacher’s performance based on student performance. Finally, wisdom became the ultimate decision of whether or not to fire an underperforming educator based on the results.

This model, as many teachers and O’Neil pointed out, does not accurately reflect teachers’ capabilities in other respects. For one, it does nothing to capture teachers’ character or how well they communicate and connect with their students. It also fails to account for students whose underperformance may stem from external factors that have nothing to do with their education at all. Even the underlying data could have been inaccurate, with teachers inflating students’ test scores to make themselves look better.
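The DIKW chain O’Neil describes for IMPACT can be sketched in a few lines of code. This is purely illustrative: the function name, the scoring rule, and the firing threshold are all invented here, and the real model was far more opaque than this.

```python
# Hypothetical sketch of the DIKW pipeline described for IMPACT.
# All names, thresholds, and the scoring rule are invented for
# illustration; they are not the actual model.

def dikw_teacher_rating(test_scores, expected_scores, fire_threshold=-5.0):
    # Data: raw student math and language test scores.
    # Information: how each student performed relative to expectation.
    gains = [actual - expected
             for actual, expected in zip(test_scores, expected_scores)]
    # Knowledge: the teacher's rating, reduced to the mean student gain.
    rating = sum(gains) / len(gains)
    # "Wisdom": a fire/keep decision made from that single number.
    decision = "fire" if rating < fire_threshold else "keep"
    return rating, decision

rating, decision = dikw_teacher_rating(
    test_scores=[70, 82, 65, 90],
    expected_scores=[75, 80, 72, 85],
)
```

The sketch makes the post’s objection concrete: nothing outside the test-score numbers, such as a struggling home life or an inflated prior-year score, ever enters the pipeline, yet the final decision rests entirely on it.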

Another instance of a machine making human decisions is the Kronos personality test for job applicants. Data such as location, age, race, gender, and mental health is collected from résumés and personality tests. That information is processed to categorize people based on their responses. The information then gives the computer the knowledge, based on the criteria in the system, of who would be a good candidate for employment. The computer exercises a form of wisdom when deciding which prospects to send through the system to the hiring executives, who decide from there who gets an interview and who gets hired.

The flaws in this system are endless, but one that stands out is the computer attempting to make decisions that some people don’t even get right. Businesses looking to hire have their own biases and prejudices, but when they simply feed those into a computer system and allow it to weed out the “undesirables,” even more people are excluded. Systems like this only worsen an already unfair job market and make it easier to discriminate based on superficial attributes. Computers shouldn’t be given the opportunity to exercise their own wisdom and have it affect the lives of real people.
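The screening flow described above can be sketched the same way. Again, the questions, weights, and cutoff here are invented for illustration, not Kronos’s actual criteria; the point is that the “knowledge” step silently encodes whatever biases the employer-chosen criteria carry.

```python
# Hypothetical sketch of a Kronos-style applicant screen.
# Questions, weights, and the cutoff are invented for illustration.

def screen_applicant(answers, weights, cutoff=0.5):
    # Data: the applicant's raw personality-test responses (0.0 to 1.0).
    # Information: responses scored against employer-chosen criteria.
    scores = [answers[q] * w for q, w in weights.items()]
    # Knowledge: a single "fit" score per applicant.
    fit = sum(scores) / sum(weights.values())
    # "Wisdom": only applicants above the cutoff ever reach a human.
    return fit, fit >= cutoff

fit, forwarded = screen_applicant(
    answers={"optimism": 0.8, "conformity": 0.5},
    weights={"optimism": 1.0, "conformity": 1.0},
)
```

Whoever sets the weights sets the bias: an applicant rejected here is never seen by the hiring executives at all, which is exactly the automated exclusion the post objects to.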
