I’m really mad at WordPress right now, because they have a “save draft” feature. After nearly completing my post, I pressed that button assuming my work would be saved. I wrote a little more to finish the post, and then started fiddling with the works cited. And then, a simple mistype – I’ve had this happen before on WordPress – and the whole document disappeared and was somehow replaced with the letter r. This is particularly frustrating because I spent an inordinate amount of time on this week’s blog and really struggled with what I wanted to say. And now I begin, again.
So be prepared for what follows: a pitiful paraphrase – fueled by stress-eating fun-sized Snickers bars – of a previously well-written attempt to discuss the research on bias in algorithms and what can be done about it.
Safiya Noble is a PhD researcher who has become best known for her publications on bias in search engine algorithms. In her book, Algorithms of Oppression, Noble builds on work done by the United Nations to show the implicit bias in autofill suggestions for phrases like “women cannot,” “women should not,” and “women need to,” which at the time of the UN study brought up many sexist and limiting ideas. If one thing can be learned from this research, it is that research can elicit change; between the initial UN study and the attention Noble undoubtedly brought to the topic, this issue has been corrected on Google, and there are *no* autofill options that populate when you type these phrases in now.
Sharon Block’s article in Digital Humanities Quarterly makes a similar claim: JSTOR, the online “journal storage” database used by libraries for all types of academic research, shows bias in the automatically assigned topics it attaches to each article. Block evaluated the work of several women’s history authors and found that “women” was not a search topic for any of their articles; instead, “men” was assigned as the most dominant topic. That certainly sounds off, doesn’t it?
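Block doesn’t describe JSTOR’s actual topic pipeline in technical terms, but a toy sketch can show how this kind of misassignment might happen. The vocabularies and scoring below are purely my own illustrative assumptions, not JSTOR’s real system: a naive labeler that just counts gendered words can tag a women’s history article as being “about men” simply because the quoted historical sources are dominated by masculine terms.

```python
# Hypothetical sketch only: a naive frequency-based topic labeler.
# JSTOR's real pipeline is not described in Block's article; the
# vocabularies and scoring here are illustrative assumptions.
from collections import Counter
import re

TOPIC_VOCAB = {
    "women": {"women", "woman", "female", "she", "her"},
    "men": {"men", "man", "male", "he", "his", "him"},
}

def dominant_topic(text: str) -> str:
    """Return the topic whose vocabulary appears most often in the text."""
    words = Counter(re.findall(r"[a-z]+", text.lower()))
    scores = {
        topic: sum(words[w] for w in vocab)
        for topic, vocab in TOPIC_VOCAB.items()
    }
    return max(scores, key=scores.get)

# A women's-history passage can still be dominated by masculine
# pronouns in quoted historical sources, so the naive labeler
# tags it "men" even though the scholarship is about women.
sample = (
    "Her study of women's labor quotes court records in which "
    "he, his employer, and the men of the town speak at length."
)
print(dominant_topic(sample))  # -> "men"
```

If something even loosely like this drives topic assignment, the bias Block found is less a deliberate choice than a side effect of counting words without asking what the scholarship is actually about.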
These authors and others have clearly identified problems within algorithms, and they draw troubling conclusions about systems that are supposed to be neutral and separate from bias. In fact, I believe that is why some of these systems were built: to remove human bias. And yet our technology, without the guiding hand of humanity, leads us astray. Noble coins a term for this, technological redlining, defined as “the power of algorithms …and the way those digital decisions reinforce oppressive social relationships and enact new modes of racial profiling” (Noble 1).
Clearly these algorithms aren’t working the way they are supposed to. But are they nefarious? I also posted some material about The Social Dilemma, showing how developers typically believe in the good of their technology and in its neutrality. But some of these folks, like Guillaume Chaslot, who helped develop YouTube’s recommendation algorithm, have now come out against the algorithms they helped build, even calling them dangerous. In fact, Chaslot has launched a site promoting algorithmic transparency, algotransparency.org, which attempts to track and more fully understand the ramifications of the algorithm’s predictions. The film highlighted time and time again that many of these tech leaders set out with the best of intentions, only to realize too late the negative impact their work was having on culture.
I still hold to the idea that algorithms are neutral; however, they exploit human nature’s dark side. Algorithms home in on what the user may be seeking, and sadly there are far too many in our world whose worst impulses are fed by the boundlessness of the internet. While Noble calls for change by adding more people of color to the tech industry, and I don’t disagree with her per se, what we really need is a change of heart in our own culture if we truly want to see the abolition of technological redlining and other nefarious practices like it.
References
Block, Sharon. “Erasure, Misrepresentation and Confusion: Investigating JSTOR Topics on Women’s and Race Histories.” Digital Humanities Quarterly, vol. 14, no. 1, 2020.
Noble, Safiya. Algorithms of Oppression: How Search Engines Reinforce Racism. NYU Press, 2018.
Orlowski, Jeff, director. The Social Dilemma. Netflix, 2020.
Jayme Kurland
October 19, 2020 — 8:55 pm
Oh Nicole, I am so sorry you had the nightmare that we all fear, and have all experienced, but I think your blog was great! It’s such a hard thing to consider whether the algorithms are “nefarious.” On one hand, after seeing such horrible search results in Noble’s work, and examples from students in the class, it is hard not to feel angry with, as Noble states, “systems of oppression.” But it is hard to know where the blame goes in some cases. I think sometimes these systems are created without nefarious intent, but also without consideration for minorities and women. I think the latter is more an “error of omission.” It is certainly an error, though, and it needs to be fixed.
Rebecca
October 20, 2020 — 12:10 am
Nicole, sorry about the snafu. I think your point about JSTOR is spot on. We think of it as a reliable and trustworthy source because the articles are peer-reviewed. But until reading Block’s article, I never considered the way you reach those articles through search engines and how that potentially alters your findings. I often made JSTOR a requirement for my students because I did not want them just googling information, but now I’m thinking about how some of their topics may have been affected by algorithms in significant ways.
Terence V
October 20, 2020 — 12:53 am
Thank you for bringing up “The Social Dilemma” because it is SO TIMELY and apropos for this week’s topic. The algorithms and other technologies related to them have grown out of control and present not just cultural dangers (redlining and marginalization) but existential threats to our society as well. I’m glad to see Noble’s activism has already brought about (some) good progress on the redlining front, but the even bigger problems “The Social Dilemma” calls out need even broader action from the government and the public.
Cassandra
October 20, 2020 — 1:52 am
Yes, you ask a good question: are algorithms nefarious? Noble doesn’t care about the intent behind the code; she just cares about results, the bottom line. However, coding for the internet is a relatively new field in the history of humanity, and I wonder how much of this is a reflection of a new field rather than nefarious design. As I write this, I do not excuse Google for the extremely troubling results from Noble’s simple search with the phrase “Black girls.” How does one establish regulations and guidelines to ensure that everyone is appropriately represented? And what role can libraries take in educating the public about vetted search engines and databases that may, can, and do provide better information than a Google search? Google is revenue driven and ad driven; libraries are not.
Madison Morrow
October 20, 2020 — 11:29 am
Oh no Nicole! I’m so sorry your original blog post got deleted! I click the save-my-draft button constantly throughout the writing process, and it’s horrible to know that it’s all for nothing. I had a similar thing happen to me in another class where we were writing on the discussion board in Blackboard: the site decided I had been inactive too long when I tried to submit my response, and it deleted it.
A constant theme I have picked up on throughout this course is that nothing is ever neutral and uninfluenced by bias and personal opinion or decisions. Unfortunately, I don’t know how we would ever move past this. As long as a person is involved with writing the algorithms, these biases can and will continue to sneak in. Yes, Noble’s research led to change, but how long had this been a problem before search engines realized they couldn’t keep marking it down as a glitch in the system?
Caroline Greer
October 20, 2020 — 3:40 pm
Hey Nicole! I send my sympathies about losing your post draft. It always seems you lose your work just when it’s almost done and you’ve worked hard on it.
You highlighted the same two works that resonated strongly with me. I wrote a lot about Noble’s work for my post because it was such an important book that I needed as a gender historian. The same with Block’s – I have often noticed that it can be difficult to find women’s and gender history works on JSTOR, but I never thought about it too much….
I am considering issues I never have before, as you are as well.
Janet
October 20, 2020 — 10:34 pm
I think that these episodes of inaccurate algorithmic assessments and biases demonstrate how a mixture of historical experts and digital tools should inform search capabilities. Heavy reliance on either historians or computer scientists alone will result in an unhealthy skew. At this time, I think bringing in more historical field experts would provide a helpful correction.
Samantha
October 21, 2020 — 3:49 am
Nicole, I am sorry to hear you had trouble with WordPress. I’ve started writing my blog posts in Microsoft Word because of past WordPress glitches I’ve experienced.
I find it really interesting, and I guess it makes me feel hopeful, that the results of autofill searches changed in the time since Noble brought the topic to light. Obviously, there is still a ways to go, and by no means are the algorithms that are utilized perfect. However, it is always good to see research bring about some change, even if small.
I am really intrigued to watch The Social Dilemma now. I was able to find it and watch the trailer, and I am interested to see what these developers have to say about their technology and how it is currently being used.