AI, Ethics, and Geoethics (CS 5970)
Module 7: Ineligible to Serve
Summary
- (30 min) Read the chapter
- (15-20 min) Additional readings
- (30 min) Discussion and case study on Slack
- (5 min) Grading declaration
Readings
Read Chapter 6, “Ineligible to Serve: Getting a Job,” in Weapons of Math Destruction. As with the previous chapter from this book, we will do both a discussion and a case study.
One of the key algorithms discussed in this chapter is still very much in active use. As additional reading (and some disturbingly cheery videos), explore the website for the “Workforce Ready Suite,” the software discussed in the book.
Discussion and Assignment
This discussion will happen in the #weapons-of-math-destruction channel. Remember to use threads so that we can keep track of the conversation more easily.
We are going to do two things for this chapter. The first is our more traditional discussion based on the chapter and quotes from the chapter. The second is that I want you to do some research about automated résumé and applicant screening.
Discussion
As with previous chapters, I have a few quotes to think about and discuss. As before, feel free to find and discuss your own quotes as well.
- “Companies like Kronos brought science into corporate human resources in part to make the process fairer.” (page 91, Chapter 6, Weapons of Math Destruction)
- “‘The primary purpose of the test,’ says Roland Behm, ‘is not to find the best employee. It’s to exclude as many people as possible as cheaply as possible.’” (page 92, Chapter 6, Weapons of Math Destruction)
- “Our livelihoods increasingly depend on our ability to make our case to machines.” (page 96, Chapter 6, Weapons of Math Destruction)
- “The key is to learn what the machines are looking for. But here too, in a digital universe touted to be fair, scientific, and democratic, the insiders find a way to gain a crucial edge.” (page 96, Chapter 6, Weapons of Math Destruction)
- “… we’ve seen time and again that mathematical models can sift through data to locate people who are likely to face great challenges, whether from crime, poverty, or education. It’s up to society whether to use that intelligence to reject and punish them – or to reach out to them with the resources they need. We can use the scale and efficiency that make WMDs so pernicious in order to help people. It all depends on the objectives we choose.” (page 98, Chapter 6, Weapons of Math Destruction)
Short Research Assignment / Case Study
I really wanted to make this case study about the ways in which you could identify bias issues in automated screening algorithms, but that is one of the projects! Instead, I want you to look very specifically at the ways in which such systems have affected (and will affect) you. How many times in your life has your resume been handled by an automated screener? Were you rejected or accepted by it? If you were rejected, did you get any advice on how to fix anything? How will this affect your future as you move forward in your career? What have you learned about the process that you could use to improve it and help others? Share your thoughts in #case-studies.
Declarations
OU students: After you have done your reading and engaged actively in discussion, complete the grading declaration titled “Module 7: Ineligible to Serve.”