Tech’s Encoded Racism

By Steven Chung posted 03-22-2022 09:00:00 AM


This article was originally published on the Delphix website on August 13, 2021.

You know how you can be reading a charming murder mystery from sixty years ago when you run across words and phrases that remind you just how casually racism was embraced at that time? Now imagine someone twenty years from now working with your code base or technical documentation having that same sobering, sinking feeling.

More to the point, imagine someone to whom those words are like darts right now, someone who experiences racism literally encoded in the company’s product. If you don’t know whether any of your employees feel that way, then you really need to ask some tough questions.

We in digital technology need to understand how words we may take for granted are read as signals by those for whom all the assumptions and defaults seem stacked against them. And these signals are not just read that way. They are genuine markers of where our caring ends. This matters within tech companies, but considering how pervasive technology is in our lives, it matters to everyone.

At Delphix, we think words are important enough that we held an inclusive language hackathon a year ago on Juneteenth to notice words that silently reinforce stereotypes and ancient tropes, and then to brainstorm replacements for them. 

“Master/Slave” was the most obvious one, a phrase that Python dropped in 2018 and that GitHub more recently got rid of, and now Twitter has taken similar steps. Our hackathon group came up with alternatives suitable for different situations, including “reader/writer”, “parent/child”, and “active/standby.” In fact, during the planning process, a distinguished engineer at Delphix, Matthew Ahrens, who is also the founder of the OpenZFS open source community, searched its code base and replaced every instance of “slave” with “dependent.” We have since updated 600 terms in our own code base, technical documentation, customer support pages, and the like. 
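A codebase-wide rename like the one described above can be sketched as a small script. This is purely an illustrative example of the approach, not the actual tooling used at Delphix or in OpenZFS; the function name, file-extension filter, and case-matching logic are my own assumptions.

```python
import re
from pathlib import Path

def replace_term(root: str, old: str, new: str,
                 extensions=(".py", ".md", ".rst")) -> int:
    """Replace whole-word occurrences of `old` with `new` under `root`.

    Returns the number of files changed. The replacement preserves
    lowercase, Capitalized, and ALL-CAPS forms of the original term.
    """
    # \b anchors keep us from mangling substrings inside longer words.
    pattern = re.compile(rf"\b{re.escape(old)}\b", re.IGNORECASE)

    def match_case(m: re.Match) -> str:
        found = m.group(0)
        if found.isupper():
            return new.upper()
        if found[0].isupper():
            return new.capitalize()
        return new

    changed = 0
    for path in Path(root).rglob("*"):
        if not path.is_file() or path.suffix not in extensions:
            continue
        text = path.read_text(encoding="utf-8")
        updated = pattern.sub(match_case, text)
        if updated != text:
            path.write_text(updated, encoding="utf-8")
            changed += 1
    return changed
```

In practice, a real migration would also review each hit by hand (some occurrences live in protocol names or third-party APIs that cannot simply be rewritten), which is why a hackathon with humans in the loop works better than a blind find-and-replace.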

While “master/slave” is a flagrant example of needlessly offensive words, complaints about other common tech terms such as “blacklist” and “whitelist” may strike you as frivolous. After all, we’ve been using “black” and “white” to stand for bad and good at least since Roman times (the Latin word for “black,” “ater,” shows up as the root of words such as “atrocious”), and the Bible associates darkness with evil.

But, as our Senior Director of Program Management Karyn Ritter says, “It doesn’t help to explain something to call it a ‘whitelist.’ Calling things by their plain names actually makes it clear for everyone and doesn’t subtly reinforce offensive concepts.” Among the alternatives for “blacklist”: “list of removed faults.”

By holding a hackathon, we made it clear not only that our company’s commitment to eliminating bias in our workplace is real and serious, but also that we understand this can only happen by giving people a voice...and then listening to those voices. If you open up the discussion and demonstrate that it’s a safe place to voice ideas and feelings, you may be surprised to learn how seemingly innocuous words can look to those who feel under societal threat.

The point is not to impose a new corporate manual of style that bans some words and enforces the use of unfamiliar and sometimes awkward replacements. Getting workers to conform because corporate management is watching them is likely not to lessen bias but to sharpen it with resentment. Rather, starting the conversation and taking action based on it are signals to all of our employees, and all who do business with us, that we take inclusivity seriously, and that we think it requires engagement in every part of our business. 

