Technological Redlining, No. 2

I’m really mad at WordPress right now, because they have a “save draft” feature. After nearly completing my post, I pressed that button assuming my work would be saved. I wrote a little more to finish the post, and then started fiddling with the works cited. And then, a simple mistype – I’ve had this happen before on WordPress – and the whole document disappeared, somehow replaced with the letter r. This is particularly frustrating because I spent an inordinate amount of time on this week’s blog and really struggled with what I wanted to say. And now I begin, again.

So be prepared for what follows: a pitiful paraphrase – fueled by stress-eating fun-sized Snickers bars – of a previously well-written attempt to discuss the research on bias in algorithms and what can be done about it.

Safiya Noble is a PhD’d researcher who has become most noted for her publications on potential bias found within search engine algorithms. In her book, Algorithms of Oppression, Noble builds on work done by the United Nations to show the implicit bias in autofill suggestions for phrases like “women cannot,” “women should not,” and “women need to,” which at the time of the UN study brought up many sexist and limiting ideas. If one thing can be learned from this research, it is that research can elicit change; between the initial study and the attention Noble undoubtedly brought to the topic, the issue has been corrected on Google, and there are *no* autofill options that populate when you type these phrases in now.

Sharon Block’s article in Digital Humanities Quarterly posits a similar claim: that JSTOR, the online “journal storage” database used by libraries for all types of academic research, shows similar bias in the automatically assigned topics it attaches to each article. Block evaluated the work of several women’s history authors and found that “women” was not an assigned topic for any of their articles; instead, “men” was associated as the most dominant topic. That certainly sounds off, doesn’t it?

These authors and others have clearly identified problems within algorithms, and they conclude that something nefarious is at work in systems that are supposed to be neutral and free from bias. In fact, I believe that is why some of these systems were built – to separate us from bias. And yet our technology, without the guiding force of humanity, leads us astray. Noble gives this phenomenon a new name, technological redlining, defining it as “the power of algorithms …and the way those digital decisions reinforce oppressive social relationships and enact new modes of racial profiling” (Noble 1).

Clearly these algorithms aren’t working the way they are supposed to. But are they nefarious? I also posted some material about The Social Dilemma, which shows how developers typically believe in the good of their technology and in its neutrality. But some of these folks, like Guillaume Chaslot, who helped develop YouTube’s recommendation algorithm, have now come out against the algorithms they built, even calling them dangerous. In fact, Chaslot has created a site promoting algorithmic transparency, algotransparency.org, which attempts to track and more fully understand the ramifications of the algorithm’s recommendations. The film highlighted time and time again that many of these tech leaders set out with the best of intentions, only to realize too late the negative impact their work was having on culture.

I still hold to the idea that algorithms are neutral; however, they exploit the dark side of human nature. Algorithms home in on what the user may be seeking, and sadly there are far too many people in our world whose worst impulses are triggered by the boundlessness of the internet. While Noble calls for change by adding more people of color to the tech industry, and I don’t disagree with her per se, what we really need is a change of heart in our own culture if we truly want to see the abolition of technological redlining and other nefarious practices like it.

References

Block, Sharon. “Erasure, Misrepresentation and Confusion: Investigating JSTOR Topics on Women’s and Race Histories.” Digital Humanities Quarterly, vol. 14, no. 1, 2020.

Noble, Safiya. Algorithms of Oppression: How Search Engines Reinforce Racism. NYU Press, 2018.

Orlowski, Jeff, director. The Social Dilemma. Netflix, 2020.
