AI, Ethics, and Geoethics (CS 5970)

Module 6: Engineered Inequity: Are Robots Racist?


  • (45-60 min) Read the chapter
  • (15 min) Read the related article
  • (15-30 min) Discussion on Slack
  • (5 min) Grading declaration


First, read Chapter 1 “Engineered Inequity: Are Robots Racist?” in the Race After Technology book.  While you read, pick out some favorite quotes to discuss.  There is a lot of great material in this chapter! 

After you finish the chapter, read “You will own slaves” in the 1957 Mechanix Illustrated.  This short but horrifying article is mentioned in Benjamin’s chapter (and is the source of the illustration on the right, which also appears in her book).  It illustrates both some horrifying ideas about racism and the terminology of slavery, as well as some really far-fetched ideas about what robots would be able to do.  


Robotic “Slaves”


 Image from “You will own slaves” by Mechanix Illustrated 


This discussion will happen in the #race-after-technology channel. Note that we will discuss both the chapter and the article in the channel. Remember to use threads so that we can keep track of the conversation more easily.

I have put some quotes from the chapter below to get you thinking about what you learned.  Some of these quotes are provocative and would make good discussion material!  Please share your thoughts about what you learned, pick a favorite quote (or several, as I did!) from the chapter, and let’s discuss.  For example, given the quote below about detachment, how do we improve the education of the people creating this technology so that detachment does not happen?  Or, given the very relevant quote about people being only numbers, how do we avoid this, especially at large companies?

My quotes come only from the chapter, but there is plenty of material (in a rather different vein) to discuss from the short magazine article.

  • “The idea that you could come up with a culturally neutral, racially neutral conception of beauty is simply mind-boggling.” (page 50, Chapter 1, Race After Technology)
  • “Racist robots, as I invoke them here, represent a much broader process: social bias embedded in technical artifacts, the allure of objectivity without public accountability.” (page 52, Chapter 1, Race After Technology)
  • “To the extent that machine learning relies on large, ‘naturally occurring’ datasets that are rife with racial (and economic and gendered) biases, the raw data that robots are using to learn and make decisions about the world reflect deeply ingrained cultural prejudices and structural hierarchies.” (page 57, Chapter 1, Race After Technology)
  • “Detachment in the face of this history ensures its ongoing codification.” (page 59, Chapter 1, Race After Technology)
  • “…by focusing mainly on individuals’ identities and overlooking the norms and structures of the tech industry, many diversity initiatives offer little more than cosmetic change, demographic percentages on a company pie chart, concealing rather than undoing the racist status quo.” (page 61, Chapter 1, Race After Technology)
  • “Instead, so much of what is routine, reasonable, intuitive, and codified reproduces unjust social arrangements, without ever burning a cross to shine light on the problem.” (page 61, Chapter 1, Race After Technology)
  • “‘We are in the uncomfortable birthing stage of artificial intelligence.’ Zeros and ones, if we are not careful, could deepen the divides between haves and have-nots, between the deserving and the undeserving – rusty value judgements embedded in shiny new systems.” (page 62, Chapter 1, Race After Technology)
  • “…automation is often presented as a solution to human bias – a way to avoid the pitfalls of prejudicial thinking by making decisions on the basis of objective calculations and scores.” (page 64, Chapter 1, Race After Technology)
  • “…the conflation of economic productivity and upright citizenship is ubiquitous across many societies.” (page 73, Chapter 1, Race After Technology)
  • “Does this mean that every form of technological prediction or personalization has racist effects? Not necessarily.  It means that, whenever we hear the promises of tech being extolled, our antenna should pop up to question what all that hype of ‘better, faster, fairer’ might be hiding and making us ignore.  And, when bias and inequity come to light, ‘lack of intention’ to harm is not a viable alibi.  One cannot reap the reward when things go right but downplay responsibility when they go wrong.” (page 75, Chapter 1, Race After Technology)


  • OU students: After you have done your reading and engaged actively in discussion, complete the grading declaration titled “Module 6: Engineered Inequality”