“The public generally trusts information found in search engines” (Noble, 38).
To fully grapple with, and potentially disentangle my thoughts from, all of this week’s readings, I would have to write a much longer essay. To spare my classmates (and maybe myself?) that endeavor, I am focusing primarily on the monograph assigned this week. Noble’s message stuck with me on every page of the book, and her argument is vital for scholars in today’s world. We often think about gaps, silences, and bias in the archives, and there is a move to consider these in digital collections and research. But now we must grapple with discrimination in essentially every facet of Internet use. Yikes.
The argument, however, is not a comfortable one. Safiya Umoja Noble’s book Algorithms of Oppression: How Search Engines Reinforce Racism refuses to soften its claims or shy away from hard topics. In her introduction, Noble recounts how a Google search for ‘black girls’ she ran in 2011 returned plenty of hardcore pornography hits. She lists the websites involved and includes a screenshot of the Google search, along with the images at the top of the screen, which are also sexual. She goes on to prove her argument that search engines and computer algorithms are inherently biased, racist, and sexist by focusing on Google and Black people’s experiences. Noble condemns the way private interests and advertising revenue perpetuate discriminatory societal ideals through what is widely seen as a neutral and factual way of disseminating information.
Going back to Noble’s introduction, she outlines a specific argument that runs throughout the whole book regarding the pornification of Black women and other minorities. I felt uncomfortable reading some of the book’s language, as she didn’t shy away from the graphic words used in porn or from showing the screenshots of what her Google search pulled up. But Noble’s point required her to state these words and make us uncomfortable. Her point is uncomfortable: search engines and algorithms perpetuate and reinforce racism and discrimination. This example sets the stage for other (less salacious) issues present in human-created algorithms, in the search engines most people trust, and in Google itself as a private company. Shock value can work, people.
The core of this monograph is racism in the information sciences as it affects Black women, but Noble’s argument is vast compared to that one problem addressed above. Other examples throughout the book show a wider range of issues, such as Google searches for the word “Jew” leading to anti-Semitic websites, even though Google blocked Nazi memorabilia from appearing in certain countries. These issues include the removal of information: “this indicates that Google can in fact remove objectionable hits, it is equally troubling, because the company provided search results without informing searchers that information was being deleted” (Noble, 45). Google added a disclaimer to search results for “Jew,” yet didn’t remove the hits. Interesting, yes; scary in lots of ways. In Chapter Three, Noble also writes about Dylann Roof, the white supremacist who murdered nine Black people at Emanuel African Methodist Episcopal Church in South Carolina in 2015. Roof used Google searches to formulate his racist ideals. Googling “black on white crimes” led him to the site of the Council of Conservative Citizens, “a modern reincarnation of the old White Citizens Council,” which fought desegregation (Noble, 112). The search led him to no reputable websites discussing interracial violence, only ‘cloaked websites.’
Discriminatory search algorithms do more than project problematic media ideals about marginalized groups. They perpetuate racism, sexism, and discrimination, and they make doing so profitable for companies such as Google. They allow people to easily find problematic and inaccurate sources with certain keywords, which can fuel violence and strife, as “[k]nowledge management reflects the same social biases that exist in society, because human beings are at the epicenter of information curation” (Noble, 141).
Noble’s call to action centers on bringing more Black people and other minorities into the tech industry, where there is a distinct dearth. She also argues that tech design should include people who have studied marginalized groups and their histories, such as scholars of African-American history or women’s and gender studies, along with an education in understanding that web searches are NOT neutral. In “Algorithmic Accountability: A Primer,” Robyn Caplan, Joan Donovan, Lauren Hanson, and Jeanna Matthews build on the work of Noble and others, writing, “[c]ritically, algorithms do not make mistakes, humans do” (Caplan et al., 22). These scholars similarly understand that a lack of oversight and capitalism perpetuate the issue; they call for independent auditing of algorithms by journalists and others, and they also argue for more government intervention. Noble, as well, calls for “public search engine alternatives, united with public-interest journalism and librarianship, to ensure that the public has access to the highest quality information available” (Noble, 152).
Noble’s point is an overarching call to scholars and non-scholars alike, as everyone uses the Internet, Google, and algorithms. For historians, primary research often begins with Google, even if that just means googling articles and historians to see their work. By relying on search engine algorithms to do this work for us, we are relying on a problematic system with roots in sexism, racism, and discrimination. (For an article about issues with JSTOR’s Topics, including the removal of the topic ‘women’ for a period of time, read Sharon Block’s “Erasure, Misrepresentation and Confusion: Investigating JSTOR Topics on Women’s and Race Histories” from Digital Humanities Quarterly.) Truly fixing the issue requires systemic change, as Noble’s call to action makes clear, but for historians, understanding the issue is a good first step. A first step only, but one I needed to learn about to even take.
Block, Sharon. “Erasure, Misrepresentation and Confusion: Investigating JSTOR Topics on Women’s and Race Histories.” Digital Humanities Quarterly 14, no. 1 (2020).
Caplan, Robyn, Joan Donovan, Lauren Hanson, and Jeanna Matthews. “Algorithmic Accountability: A Primer.” Data & Society, 2018.
Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: NYU Press, 2018.
Ziegler, S. L. “Open Data in Cultural Heritage Institutions: Can We Be Better Than Data Brokers?” Digital Humanities Quarterly 14, no. 2 (2020).